# Testing Pipeline - SoenCapstone/GameOn GitHub Wiki
GameOn is developed using a microservices architecture: multiple backend services written in Go (e.g., go-user-service and go-team-service) and a Spring Boot API Gateway that handles routing, authentication, and inter-service communication. The frontend mobile app is built with React Native.
This document outlines the testing and continuous integration (CI) strategy adopted by the GameOn team to ensure reliable functionality, maintainable code, and consistent deployment across all services.
GameOn’s testing approach is multi-layered, covering unit, integration, end-to-end, and user acceptance levels. Each test layer focuses on specific aspects of the system, and all are automated through GitHub Actions.
## Unit Testing

Unit testing ensures that each function or component behaves as expected in isolation.

- **Backend (Go microservices):** Uses Go's built-in `testing` package and the Testify framework for assertions and mocks. Each service includes unit tests for handlers, repositories, and core logic.
- **API Gateway (Spring Boot):** JUnit 5 and Mockito are used to test controller routes, authentication filters, and service integrations. These tests ensure the gateway properly validates tokens and routes requests to the correct microservice.
- **Frontend (React Native):** Uses Jest and React Testing Library to test individual components, hooks, and utility functions.

**Goal:** Maintain a minimum of 80% test coverage across all services. Code below this threshold will not be merged into `main`.
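As a minimal sketch of the Go unit-testing style described above: the validator below is hypothetical (its name and team-size rules are not from the GameOn codebase), and the check loop in `main` stands in for what would normally be a table-driven test in a `_test.go` file using `testing.T` and Testify's `require`/`assert` helpers.

```go
package main

import (
	"errors"
	"fmt"
)

// ValidateTeamSize is a hypothetical piece of core logic from
// go-team-service: a team must have between 2 and 11 players.
func ValidateTeamSize(n int) error {
	if n < 2 {
		return errors.New("team needs at least 2 players")
	}
	if n > 11 {
		return errors.New("team cannot exceed 11 players")
	}
	return nil
}

func main() {
	// In the repository this would be a table-driven test function,
	// e.g. func TestValidateTeamSize(t *testing.T), run via `go test`.
	cases := []struct {
		name    string
		size    int
		wantErr bool
	}{
		{"too small", 1, true},
		{"minimum", 2, false},
		{"maximum", 11, false},
		{"too large", 12, true},
	}
	for _, c := range cases {
		err := ValidateTeamSize(c.size)
		if (err != nil) != c.wantErr {
			panic(fmt.Sprintf("%s: unexpected result %v", c.name, err))
		}
	}
	fmt.Println("all cases passed")
}
```

Table-driven tests like this are the idiomatic Go pattern: adding a new edge case is one line in the `cases` slice rather than a new test function.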
## Integration Testing

Integration testing focuses on verifying communication between components and services.

- **Microservice Interactions:** Conducted using Docker Compose test environments that simulate real inter-service communication via REST APIs through the API Gateway.
- **Database Integration:** Each service runs integration tests against a temporary PostgreSQL database initialized with Flyway migrations to ensure consistent schemas and data across environments.
- **Authentication Integration:** Integration tests validate Clerk authentication and authorization, ensuring endpoints correctly handle valid and invalid JWT tokens.

**Goal:** Confirm that all services communicate seamlessly and securely under production-like configurations.
## End-to-End Testing

End-to-end testing validates complete workflows from a user's perspective, covering both frontend and backend.

- **Tool:** Cypress (for mobile web builds and automated UI testing).

**Goal:** Guarantee that all primary user flows perform correctly across the integrated system.
## User Acceptance Testing (UAT)

UAT ensures that the system meets user expectations and functional requirements.

- **Approach:** Conducted during each iteration's pre-release phase with non-developer participants (e.g., testers or classmates) to simulate real-world use.
- **Focus:** Validating usability, navigation, and end-user satisfaction.

Feedback from UAT sessions is used to improve the user experience and address usability issues before release.
## Performance and Load Testing

To ensure system scalability and responsiveness under high usage:

- **Tool:** JMeter (for backend load simulation).
- **Goal:** Simulate concurrent users (e.g., multiple players joining events simultaneously) and analyze latency, throughput, and resource utilization.
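JMeter is the team's load-testing tool; purely as an illustration of what "concurrent users joining an event" means in load-test terms, here is a minimal Go sketch that hammers a stub endpoint with concurrent goroutines and reports latency and throughput. The endpoint path and the 2 ms simulated handler delay are assumptions, not GameOn behavior.

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"sync"
	"time"
)

// runLoad starts a stub "join event" endpoint, then issues
// users*perUser requests from `users` concurrent goroutines.
// It returns the completed-request count, mean latency, and
// throughput in requests per second.
func runLoad(users, perUser int) (int, time.Duration, float64) {
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Millisecond) // simulated handler work
		w.WriteHeader(http.StatusOK)
	}))
	defer srv.Close()

	var (
		mu        sync.Mutex
		latencies []time.Duration
		wg        sync.WaitGroup
	)
	start := time.Now()
	for u := 0; u < users; u++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < perUser; i++ {
				t0 := time.Now()
				resp, err := http.Get(srv.URL + "/events/123/join")
				if err != nil {
					continue
				}
				resp.Body.Close()
				mu.Lock()
				latencies = append(latencies, time.Since(t0))
				mu.Unlock()
			}
		}()
	}
	wg.Wait()
	elapsed := time.Since(start)

	if len(latencies) == 0 {
		return 0, 0, 0
	}
	var total time.Duration
	for _, l := range latencies {
		total += l
	}
	mean := total / time.Duration(len(latencies))
	return len(latencies), mean, float64(len(latencies)) / elapsed.Seconds()
}

func main() {
	n, mean, rps := runLoad(20, 10)
	fmt.Printf("requests: %d\nmean latency: %v\nthroughput: %.0f req/s\n", n, mean, rps)
}
```

A real JMeter test plan adds ramp-up schedules, assertions, and percentile reports on top of this same basic idea; in practice it would be pointed at a staging deployment rather than an in-process stub.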
## Code Coverage and Quality

- A minimum of 80% coverage is required for all services (backend, gateway, frontend).
- Coverage and code quality are tracked through SonarCloud.
- Quality gates in CI prevent merging if coverage or lint checks fail.
- Static analysis detects code smells, vulnerabilities, and maintainability issues early in development.
## Continuous Integration with GitHub Actions

All repositories are integrated with GitHub Actions, which automatically runs tests and quality checks on every pull request and on every push to `main`.

Stages in the workflow:

- **Linting and Static Analysis:**
  - Runs `lint` for frontend code.
  - Runs SonarCloud analysis for all repositories.
- **Unit and Integration Tests:**
  - Executes Go tests with coverage reports.
  - Runs JUnit tests for the API Gateway.
  - Executes Jest tests for frontend components.
- **End-to-End Testing:**
  - Cypress tests run automatically on PRs to validate main user flows.
- **Quality Gate Validation:**
  - CI enforces test and coverage thresholds before merge.
- **Containerization and Deployment:**
  - Each service builds its Docker image.
  - The CI pipeline deploys to a staging environment for verification.
  - Upon approval, a production deployment is triggered.
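For concreteness, a workflow for one of the Go microservice repositories might look roughly like the sketch below. This is an illustrative config fragment, not the team's actual workflow file: the job names, action versions, Go version, and image tag are all assumptions.

```yaml
# Hypothetical CI workflow sketch for a Go microservice repository.
name: ci
on:
  pull_request:
  push:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
        with:
          go-version: "1.22"
      - name: Unit and integration tests with coverage
        run: go test ./... -coverprofile=coverage.out
      - name: SonarCloud scan
        uses: SonarSource/sonarcloud-github-action@v2
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
  docker:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build Docker image
        run: docker build -t gameon/go-user-service .
```

Making the `docker` job depend on `test` (`needs: test`) is what enforces the quality gate: an image is only built, and later deployed, once tests and analysis have passed.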
## Pull Request and Review Process

All code contributions follow a structured pull request (PR) process:

- Developers create feature branches and open PRs to `main`.
- GitHub Actions automatically runs all tests and static analysis checks.
- Each PR requires at least one reviewer approval before merging.
- CI prevents merging if any test fails or a quality gate is unmet.

This ensures that all merged code is reviewed, tested, and compliant with GameOn's coding standards.