Common Workflows
This guide provides step-by-step workflows for common TestRail operations using the MCP server. Each workflow lists the natural-language commands to issue and the expected outcome. Where helpful, a workflow is followed by a short, non-authoritative sketch of roughly equivalent direct TestRail REST API calls.
🚀 Getting Started Workflows
1. Initial Project Setup
Goal: Set up a new testing project from scratch
Step 1: "Create a new project called 'Mobile App v2.0 Testing'"
Step 2: "Create a test suite called 'Core Functionality' in project [project_id]"
Step 3: "Add sections for different feature areas"
Step 4: "Create test case templates for the project"
Expected Outcome: New project with organized structure ready for test case creation.
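Under the hood, these prompts map to TestRail REST API calls. The sketch below shows one way the same setup could be done directly against the API; the instance URL, credentials, suite mode, and section names are placeholders, not values required by the MCP server.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder TestRail instance
AUTH = ("user@example.com", "api_key")                  # TestRail email + API key
HEADERS = {"Content-Type": "application/json"}

# Step 1: create the project (suite_mode=3 enables multiple test suites)
project = requests.post(f"{BASE}/add_project", auth=AUTH, headers=HEADERS,
                        json={"name": "Mobile App v2.0 Testing", "suite_mode": 3}).json()

# Step 2: create a test suite inside the new project
suite = requests.post(f"{BASE}/add_suite/{project['id']}", auth=AUTH, headers=HEADERS,
                      json={"name": "Core Functionality"}).json()

# Step 3: add sections for different feature areas (names are hypothetical examples)
for name in ["Authentication", "Navigation", "Payments"]:
    requests.post(f"{BASE}/add_section/{project['id']}", auth=AUTH, headers=HEADERS,
                  json={"suite_id": suite["id"], "name": name})
```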
2. Project Discovery
Goal: Understand existing TestRail setup
Step 1: "Get all projects from TestRail"
Step 2: "Show me details for project [project_name]"
Step 3: "List all test suites in project [project_id]"
Step 4: "Get sections and their organization"
Expected Outcome: Complete understanding of project structure and organization.
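A minimal discovery sketch using the public TestRail API, assuming a placeholder instance and credentials. Newer TestRail versions return paginated objects for some endpoints, which the helper below accounts for.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")

def get(endpoint):
    resp = requests.get(f"{BASE}/{endpoint}", auth=AUTH)
    resp.raise_for_status()
    return resp.json()

# Step 1: list projects (newer TestRail versions wrap results in a paginated object)
data = get("get_projects")
projects = data.get("projects", data) if isinstance(data, dict) else data
for p in projects:
    print(p["id"], p["name"])

# Steps 2-4: drill into one project's suites and sections
project_id = projects[0]["id"]
for s in get(f"get_suites/{project_id}"):
    sections = get(f"get_sections/{project_id}&suite_id={s['id']}")
    sections = sections.get("sections", sections) if isinstance(sections, dict) else sections
    print(s["name"], "->", [sec["name"] for sec in sections])
```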
📝 Test Case Management Workflows
3. Creating Test Cases
Goal: Add new test cases to an existing project
Step 1: "Get sections for project [project_id] to find the right location"
Step 2: "Create a new test case in section [section_id] with title 'User Login Validation'"
Step 3: "Update the test case with detailed steps and expected results"
Step 4: "Set priority and custom fields as needed"
Expected Outcome: Well-structured test case ready for execution.
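The sketch below shows the equivalent direct API calls for Steps 2-4, assuming a placeholder section ID and the default "Test Case (Steps)" template; the steps field's system name (custom_steps_separated here) depends on your instance's case field configuration.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

SECTION_ID = 123  # placeholder: the section found in Step 1

# Step 2: create the case (priority_id 1-4 maps to Low..Critical in a default install)
case = requests.post(f"{BASE}/add_case/{SECTION_ID}", auth=AUTH, headers=HEADERS, json={
    "title": "User Login Validation",
    "priority_id": 3,
}).json()

# Step 3: add separated steps and expected results; adjust the field name to
# match your case template if it differs from custom_steps_separated
requests.post(f"{BASE}/update_case/{case['id']}", auth=AUTH, headers=HEADERS, json={
    "custom_steps_separated": [
        {"content": "Open the login page", "expected": "Login form is displayed"},
        {"content": "Submit valid credentials", "expected": "User is redirected to the dashboard"},
    ],
})
```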
4. Bulk Test Case Operations
Goal: Efficiently manage multiple test cases
Step 1: "Get test cases from project [project_id] with field filtering - return only id, title, section_id"
Step 2: "Copy test cases [case_ids] to section [target_section_id]"
Step 3: "Update multiple test cases with consistent priority and labels"
Step 4: "Verify the changes were applied correctly"
Expected Outcome: Multiple test cases organized and updated efficiently.
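A rough equivalent via the REST API, with placeholder project/suite/section IDs. Note that field filtering is a convenience of the MCP server; the API returns full case records, so the projection below happens client-side. Bulk update via update_cases/:suite_id is available only on newer TestRail versions.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

PROJECT_ID, SUITE_ID, TARGET_SECTION_ID = 1, 2, 45  # placeholders

# Step 1: fetch cases, then keep only the fields we need (client-side projection)
resp = requests.get(f"{BASE}/get_cases/{PROJECT_ID}&suite_id={SUITE_ID}", auth=AUTH).json()
cases = resp.get("cases", resp) if isinstance(resp, dict) else resp
slim = [{k: c[k] for k in ("id", "title", "section_id")} for c in cases]

# Step 2: copy selected cases into the target section
case_ids = [c["id"] for c in slim[:10]]
requests.post(f"{BASE}/copy_cases_to_section/{TARGET_SECTION_ID}", auth=AUTH,
              headers=HEADERS, json={"case_ids": case_ids})

# Step 3: bulk-update the priority on the same cases
requests.post(f"{BASE}/update_cases/{SUITE_ID}", auth=AUTH, headers=HEADERS,
              json={"case_ids": case_ids, "priority_id": 2})
```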
5. Test Case Maintenance
Goal: Keep test cases up-to-date and organized
Step 1: "Get test cases updated in the last 30 days"
Step 2: "Review test case history for recent changes"
Step 3: "Update outdated test cases with new requirements"
Step 4: "Move test cases to appropriate sections if needed"
Expected Outcome: Current and well-organized test case repository.
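A sketch of Steps 1, 2, and 4 against the REST API, under placeholder IDs. get_history_for_case requires TestRail 6.3 or later, and the exact request fields accepted by move_cases_to_section can vary by version, so treat this as an outline rather than a definitive implementation.

```python
import time
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

PROJECT_ID, SUITE_ID, TARGET_SECTION_ID = 1, 2, 99  # placeholders

# Step 1: cases updated in the last 30 days (updated_after is a UNIX timestamp)
since = int(time.time()) - 30 * 24 * 3600
resp = requests.get(f"{BASE}/get_cases/{PROJECT_ID}&suite_id={SUITE_ID}&updated_after={since}",
                    auth=AUTH).json()
cases = resp.get("cases", resp) if isinstance(resp, dict) else resp

# Step 2: review recent change history for each case (TestRail 6.3+)
for c in cases:
    history = requests.get(f"{BASE}/get_history_for_case/{c['id']}", auth=AUTH).json()
    print(c["id"], c["title"], "->", history)

# Step 4: move cases that belong elsewhere into the right section
requests.post(f"{BASE}/move_cases_to_section/{TARGET_SECTION_ID}", auth=AUTH,
              headers=HEADERS, json={"case_ids": [c["id"] for c in cases[:5]]})
```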
🏃 Test Execution Workflows
6. Simple Test Run
Goal: Execute a basic test run
Step 1: "Create a new test run for project [project_id] called 'Sprint 24 Testing'"
Step 2: "Get tests in the run to see what needs to be executed"
Step 3: "Add passed result for test case [case_id] with execution notes"
Step 4: "Add failed results with defect information where needed"
Step 5: "Close the test run when execution is complete"
Expected Outcome: Completed test run with documented results.
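The same run lifecycle expressed as direct API calls. Case IDs, the defect reference, and the build number below are placeholders; the default status IDs are 1 for Passed and 5 for Failed.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

PROJECT_ID, SUITE_ID = 1, 2  # placeholders

# Step 1: create the run (include_all pulls in every case from the suite)
run = requests.post(f"{BASE}/add_run/{PROJECT_ID}", auth=AUTH, headers=HEADERS, json={
    "suite_id": SUITE_ID, "name": "Sprint 24 Testing", "include_all": True,
}).json()

# Step 2: list the tests that need to be executed
tests = requests.get(f"{BASE}/get_tests/{run['id']}", auth=AUTH).json()
tests = tests.get("tests", tests) if isinstance(tests, dict) else tests

# Steps 3-4: report results per case (placeholder case IDs 101 and 102)
requests.post(f"{BASE}/add_result_for_case/{run['id']}/101", auth=AUTH, headers=HEADERS,
              json={"status_id": 1, "comment": "Verified on build 2.0.13"})
requests.post(f"{BASE}/add_result_for_case/{run['id']}/102", auth=AUTH, headers=HEADERS,
              json={"status_id": 5, "comment": "Crash on submit", "defects": "JIRA-482"})

# Step 5: close the run once execution is complete
requests.post(f"{BASE}/close_run/{run['id']}", auth=AUTH, headers=HEADERS, json={})
```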
7. Regression Testing
Goal: Execute regression test suite
Step 1: "Get test cases with label 'regression' from project [project_id]"
Step 2: "Create a test run including all regression test cases"
Step 3: "Execute tests systematically, adding results as you go"
Step 4: "Generate regression test report"
Step 5: "Update test case priorities based on failure patterns"
Expected Outcome: Completed regression testing with actionable insights.
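How regression cases are tagged is project-specific, so the sketch below filters client-side on a hypothetical checkbox custom field named custom_regression; substitute your own label or field convention. IDs and the run name are placeholders.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

PROJECT_ID, SUITE_ID = 1, 2  # placeholders

# Step 1: fetch cases, then select the regression set client-side
resp = requests.get(f"{BASE}/get_cases/{PROJECT_ID}&suite_id={SUITE_ID}", auth=AUTH).json()
cases = resp.get("cases", resp) if isinstance(resp, dict) else resp
regression_ids = [c["id"] for c in cases if c.get("custom_regression")]  # hypothetical field

# Step 2: create a run scoped to just those cases
run = requests.post(f"{BASE}/add_run/{PROJECT_ID}", auth=AUTH, headers=HEADERS, json={
    "suite_id": SUITE_ID, "name": "Regression - Release 2.0",
    "include_all": False, "case_ids": regression_ids,
}).json()
print("Created regression run", run["id"], "with", len(regression_ids), "cases")
```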
📋 Test Planning Workflows
8. Release Planning
Goal: Plan comprehensive testing for a release
Step 1: "Create a milestone for 'Release 2.0' with target date"
Step 2: "Create a test plan called 'Release 2.0 Comprehensive Testing'"
Step 3: "Add multiple plan entries for different testing phases"
Step 4: "Assign test runs to different team members"
Step 5: "Track progress and update plan as needed"
Expected Outcome: Well-organized release testing plan with clear assignments.
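A sketch of the milestone/plan/entry chain via the REST API. The target date, phase names, and assignee user IDs are placeholder examples, and due_on is a UNIX timestamp.

```python
import requests
from datetime import datetime, timezone

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

PROJECT_ID, SUITE_ID = 1, 2  # placeholders

# Step 1: milestone with a target date (placeholder date)
due = int(datetime(2025, 9, 30, tzinfo=timezone.utc).timestamp())
milestone = requests.post(f"{BASE}/add_milestone/{PROJECT_ID}", auth=AUTH, headers=HEADERS,
                          json={"name": "Release 2.0", "due_on": due}).json()

# Step 2: a test plan tied to the milestone
plan = requests.post(f"{BASE}/add_plan/{PROJECT_ID}", auth=AUTH, headers=HEADERS, json={
    "name": "Release 2.0 Comprehensive Testing", "milestone_id": milestone["id"],
}).json()

# Steps 3-4: one plan entry per testing phase, each assigned to a team member
for phase, assignee in [("Smoke", 5), ("Functional", 6), ("Regression", 7)]:  # placeholder user IDs
    requests.post(f"{BASE}/add_plan_entry/{plan['id']}", auth=AUTH, headers=HEADERS, json={
        "suite_id": SUITE_ID, "name": f"{phase} Testing",
        "include_all": True, "assignedto_id": assignee,
    })
```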
9. Sprint Planning
Goal: Plan testing activities for an agile sprint
Step 1: "Get test cases related to sprint user stories"
Step 2: "Create a test plan for 'Sprint 24'"
Step 3: "Organize test cases by feature and priority"
Step 4: "Estimate testing effort and assign resources"
Step 5: "Set up automated result reporting for CI/CD"
Expected Outcome: Sprint testing plan aligned with development activities.
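For Step 5, a CI job typically reports automated outcomes back to the sprint run in one bulk call. The sketch below assumes a placeholder run ID, a hypothetical case-to-outcome mapping (e.g. parsed from a JUnit report), and a CI service account; add_results_for_cases accepts the whole batch at once.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("ci-bot@example.com", "api_key")                # placeholder CI service account
HEADERS = {"Content-Type": "application/json"}

RUN_ID = 321  # placeholder: the sprint run created from the test plan

# Hypothetical case_id -> outcome map produced by the CI pipeline
ci_outcomes = {1001: "passed", 1002: "failed"}

# Default status IDs: 1=Passed, 5=Failed
results = [
    {"case_id": cid, "status_id": 1 if outcome == "passed" else 5,
     "comment": f"CI build #4711: {outcome}"}  # placeholder build number
    for cid, outcome in ci_outcomes.items()
]

# Bulk-report all results in a single call
requests.post(f"{BASE}/add_results_for_cases/{RUN_ID}", auth=AUTH, headers=HEADERS,
              json={"results": results})
```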
📊 Reporting and Analysis Workflows
10. Test Coverage Analysis
Goal: Understand test coverage and gaps
Step 1: "Get all test cases for project [project_id] with requirements mapping"
Step 2: "Analyze test case distribution across features"
Step 3: "Identify areas with insufficient test coverage"
Step 4: "Generate coverage report for stakeholders"
Step 5: "Plan additional test cases for coverage gaps"
Expected Outcome: Clear understanding of test coverage with action plan.
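One simple way to approximate Steps 1-3 directly against the API: pull all cases, count them per section, and flag cases whose refs field (the requirement reference) is empty. Project and suite IDs are placeholders.

```python
import requests
from collections import Counter

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")

PROJECT_ID, SUITE_ID = 1, 2  # placeholders

# Step 1: pull all cases, including the refs field used for requirement mapping
resp = requests.get(f"{BASE}/get_cases/{PROJECT_ID}&suite_id={SUITE_ID}", auth=AUTH).json()
cases = resp.get("cases", resp) if isinstance(resp, dict) else resp

# Steps 2-3: distribution per section, plus cases with no requirement reference
per_section = Counter(c["section_id"] for c in cases)
unmapped = [c["id"] for c in cases if not c.get("refs")]

print("Cases per section:", dict(per_section))
print("Cases without requirement refs (coverage gaps to review):", unmapped)
```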
11. Quality Metrics Reporting
Goal: Generate quality insights for management
Step 1: "Get test execution results for the last month"
Step 2: "Analyze pass/fail rates by feature area"
Step 3: "Identify recurring failure patterns"
Step 4: "Generate executive summary report"
Step 5: "Recommend quality improvement actions"
Expected Outcome: Data-driven quality insights with recommendations.
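A sketch for Steps 1-2: collect last month's runs, aggregate their results, and compute a pass rate. It assumes the default status IDs (1=Passed, 5=Failed) and a placeholder project ID; "by feature area" would require joining results back to sections, which is omitted here.

```python
import time
import requests
from collections import Counter

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")

PROJECT_ID = 1  # placeholder

# Step 1: runs created in the last month, then their results
since = int(time.time()) - 30 * 24 * 3600
resp = requests.get(f"{BASE}/get_runs/{PROJECT_ID}&created_after={since}", auth=AUTH).json()
runs = resp.get("runs", resp) if isinstance(resp, dict) else resp

# Step 2: aggregate result counts by status across all runs
totals = Counter()
for run in runs:
    r = requests.get(f"{BASE}/get_results_for_run/{run['id']}", auth=AUTH).json()
    results = r.get("results", r) if isinstance(r, dict) else r
    totals.update(res["status_id"] for res in results)

passed, failed = totals.get(1, 0), totals.get(5, 0)
executed = passed + failed
print(f"Pass rate: {passed / executed:.1%}" if executed else "No executed results yet")
```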
🔧 Advanced Workflows
12. BDD Integration
Goal: Integrate with BDD development process
Step 1: "Export existing test cases as BDD scenarios"
Step 2: "Review and refine Gherkin format"
Step 3: "Import updated BDD scenarios back to TestRail"
Step 4: "Link BDD scenarios to automated test execution"
Step 5: "Maintain synchronization between BDD and TestRail"
Expected Outcome: Seamless BDD workflow with TestRail integration.
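TestRail's API exposes BDD endpoints for Gherkin export and import. The sketch below covers only the export half of Step 1, saving each case as a .feature file; the case IDs and output folder are placeholders, and the exact response/upload formats of get_bdd and add_bdd depend on your TestRail version, so check its API documentation before relying on this.

```python
import pathlib
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")

CASE_IDS = [101, 102]               # placeholder: cases to export as BDD scenarios
OUT_DIR = pathlib.Path("features")  # local folder for the exported .feature files
OUT_DIR.mkdir(exist_ok=True)

# Step 1: export each case's Gherkin content via get_bdd/:case_id
for case_id in CASE_IDS:
    resp = requests.get(f"{BASE}/get_bdd/{case_id}", auth=AUTH)
    resp.raise_for_status()
    (OUT_DIR / f"C{case_id}.feature").write_text(resp.text, encoding="utf-8")
    print("Exported case", case_id)

# Step 3 goes in the other direction: add_bdd/:section_id imports a refined
# .feature file back into TestRail (upload format varies by version).
```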
13. Custom Field Management
Goal: Leverage custom fields for enhanced tracking
Step 1: "Get available custom fields for the project"
Step 2: "Update test cases with custom field values"
Step 3: "Use custom fields for advanced filtering and reporting"
Step 4: "Create custom reports based on field values"
Step 5: "Maintain custom field consistency across test cases"
Expected Outcome: Enhanced test case metadata and reporting capabilities.
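A sketch of Steps 1-2 via the API: get_case_fields lists the custom case fields configured on your instance (their system names always start with custom_), and update_case sets a value on a case. The field name custom_automation_type and the case ID below are hypothetical examples.

```python
import requests

BASE = "https://example.testrail.io/index.php?/api/v2"  # placeholder instance
AUTH = ("user@example.com", "api_key")
HEADERS = {"Content-Type": "application/json"}

# Step 1: discover the custom case fields configured on this instance
fields = requests.get(f"{BASE}/get_case_fields", auth=AUTH).json()
for f in fields:
    print(f["system_name"], "-", f["label"])

# Step 2: set a custom field value on a case (hypothetical field and case ID)
CASE_ID = 101
requests.post(f"{BASE}/update_case/{CASE_ID}", auth=AUTH, headers=HEADERS,
              json={"custom_automation_type": 1})
```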
💡 Best Practices
Workflow Optimization Tips
- Start Small: Begin with basic operations before complex workflows
- Use Field Filtering: Request only the fields you need when working with large datasets
- Batch Operations: Group related operations for efficiency
- Documentation: Document custom workflows for team use