Test execution DLV environment

Objective

To validate that one or more user stories (US) work in combination with existing Nextens functionality and that existing Nextens functionality is not impacted by those US (regression test).

Who

QA of the US and QA of the impacted teams.

What

Execution of the US under test and of the impacted assets of other teams.

When

When the testing of one or more US on DEV is completed and the functionality is intended for release to production, the functionality is merged to the DLV environment, where a functional and regression test is executed.

How

  • Create an overview of the changed functionality and its impact on other assets and teams (see dependency overview)
  • Inform the teams about the upcoming release, the planning, the changes and the impact:
    • When the impact is small, it can be sufficient to share this in the T*R release channel and tag the impacted teams
    • When the impact is large, schedule a meeting with the QAs of the impacted teams to inform them about the changes and impact and to align on planning
  • When the functionality is merged to DLV:
    • Prepare test data (if applicable)
    • Perform the test of the US (manual and/or automated; see the sketch after this list)
    • Ask the impacted teams to perform their regression test (manual and/or automated)
  • In case of deviations, determine whether it is a bug (described in the US) or a change (not described in the US):
    • Create a bug or task on the board
    • Discuss with or inform DEV about the bug/task found
  • When testing on DLV is complete, inform the DM and the rest of the team of the result
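
The automated part of these test steps can be covered by a small smoke/regression suite pointed at the DLV environment. Below is a minimal sketch in Python (pytest + requests), assuming a hypothetical DLV base URL and endpoint paths; replace them with the actual DLV addresses and the assets listed in the dependency overview.

```python
import pytest
import requests

# Hypothetical DLV base URL -- replace with the actual environment address.
DLV_BASE_URL = "https://dlv.example.nextens.local"

# Hypothetical endpoints touched by the US under test; adjust to the real impacted assets.
SMOKE_ENDPOINTS = [
    "/health",
    "/swagger/v1/swagger.json",
]


@pytest.mark.parametrize("path", SMOKE_ENDPOINTS)
def test_dlv_endpoint_responds(path):
    """Basic regression check: the endpoint on DLV still responds successfully."""
    response = requests.get(f"{DLV_BASE_URL}{path}", timeout=30)
    assert response.status_code == 200, (
        f"{path} returned {response.status_code}; log a bug or task on the board"
    )
```

Such a suite can be run by the QA of the US as well as by the impacted teams; any failing check is then logged as a bug or task on the board as described above.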

Tools

  • Management Portal
  • DevOps tasks or test cases
  • DevOps bugs
  • DoD