Section 11 ‐ Quality Assurance (QA) & Testing
Quality Assurance (QA) is an integral part of the Aperture Viewer development lifecycle, dedicated to ensuring that each release is stable and reliable, performs acceptably, and delivers on its promise of visual excellence and creator empowerment. The QA process aims to identify and address defects, usability issues, and regressions before they reach end-users.
11.1 QA Strategy and Philosophy
Aperture Viewer's QA strategy is built upon the following core principles:
- Early and Continuous Testing: QA involvement begins early in the development cycle. New features and significant changes are discussed with QA to identify potential testing challenges and develop preliminary test approaches. Testing is an ongoing activity, not just a final phase.
- User-Centric Approach: Testing focuses on the end-user experience, particularly for the target audience of visual creators. Usability, workflow efficiency for photographers/videographers, and visual fidelity are key considerations.
- Risk-Based Testing: Testing efforts are prioritized based on the risk associated with new features or changes, the criticality of affected components, and the potential impact on users.
- Comprehensive Coverage: While exhaustive testing of every permutation is impossible, QA strives for comprehensive coverage of core functionalities, new features, and areas impacted by recent changes. This includes functional testing, performance testing, visual regression testing, and usability testing.
- Collaboration: Close collaboration between the Head of QA, the Project Lead (developer), and any other contributors (e.g., beta testers, community members providing feedback) is essential for effective QA.
- Clear Defect Reporting and Tracking: All identified issues are reported clearly, concisely, and with sufficient detail to allow for efficient diagnosis and resolution, primarily via the GitHub Issues tracker.
- Regression Prevention: A key goal is to prevent previously fixed bugs from reappearing in later releases. Regression testing forms an important part of the pre-release process.
- Iterative Improvement: QA processes themselves are subject to review and iterative improvement based on experience and feedback.
11.2 Testing Types Employed
Aperture Viewer utilizes a variety of testing types throughout its development lifecycle:
- Developer Testing (Unit/Component):
- Performed By: Project Lead (William Weaver) during initial development of features or bug fixes.
- Scope: Testing individual code units, modules, or components in isolation to verify their correctness.
- Focus: Ensuring the foundational building blocks of a feature work as expected before integration.
- Integration Testing:
- Performed By: Project Lead, potentially assisted by Head of QA.
- Scope: Testing the interaction between newly developed/modified components and existing parts of the viewer.
- Focus: Identifying issues that arise when different parts of the codebase are combined.
- Functional Testing:
- Performed By: Head of QA, QA team members (if any).
- Scope: Verifying that features and functionalities of the viewer operate according to their specified requirements and user expectations.
- Focus: Does the feature do what it's supposed to do? Are all user interactions handled correctly?
- Visual Fidelity Testing:
- Performed By: Head of QA, Project Lead, visually astute testers.
- Scope: Assessing the visual output of the viewer, particularly for new rendering features, changes to lighting/shadows/post-processing, and ensuring no unintended visual artifacts are introduced.
- Focus: Does it look right? Does it meet Aperture's standards for visual excellence? A screenshot-comparison sketch follows this list.
- Performance Testing:
- Performed By: Head of QA, Project Lead, testers with varying hardware.
- Scope: Measuring and evaluating the viewer's performance (FPS, startup time, resource usage like VRAM/CPU) under various conditions, especially after changes that could impact performance.
- Focus: Does it perform acceptably? Have new features introduced significant regressions? (Refer to specific Deep Dive documents like "Default FPS and VSync Settings Analysis" for performance considerations). A frame-time summary sketch follows this list.
- Usability Testing:
- Performed By: Head of QA, target audience representatives.
- Scope: Evaluating the ease of use, intuitiveness, and efficiency of the user interface and workflows, especially for new features or modified UI elements.
- Focus: Is it easy to understand and use? Does it enhance or hinder the creative workflow?
- Regression Testing:
- Performed By: Head of QA, QA team.
- Scope: Re-testing previously fixed bugs and core functionalities after new changes are integrated to ensure that existing functionality has not been broken (regressed).
- Focus: Preventing the reintroduction of old problems.
- Build Verification Testing (BVT) / Smoke Testing:
- Performed By: Project Lead (after a build), Head of QA (on receiving a test build).
- Scope: A quick set of tests on a newly compiled build to ensure it installs, launches, allows login, and that core functionalities are broadly working.
- Focus: Is the build stable enough for more detailed testing? A launch-check sketch follows this list.
- Alpha/Beta Testing (Community Involvement):
- Performed By: Selected community members and interested users on pre-release builds.
- Scope: Broader testing across a wider range of hardware, use cases, and content.
- Focus: Identifying issues, gathering feedback on new features, and assessing real-world usability before a public release.
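For visual fidelity and regression checks, a simple automated screenshot comparison can flag unintended changes before a human review. The sketch below is a minimal example, assuming a reference and a candidate screenshot captured at the same camera position, resolution, and graphics settings, and that Pillow is available; the file names and threshold are illustrative rather than part of any existing tooling.

```python
"""Minimal screenshot-comparison sketch for visual regression checks.

Assumes matching reference/candidate captures; file names and the threshold
are illustrative, not part of the viewer or its tooling.
"""
from PIL import Image, ImageChops


def screenshots_match(reference_path: str, candidate_path: str,
                      max_differing_fraction: float = 0.001) -> bool:
    ref = Image.open(reference_path).convert("RGB")
    cand = Image.open(candidate_path).convert("RGB")
    if ref.size != cand.size:
        return False  # a resolution mismatch counts as a failure

    # Per-pixel absolute difference; any non-black pixel marks a change.
    diff = ImageChops.difference(ref, cand)
    differing = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    return differing / (ref.width * ref.height) <= max_differing_fraction


if __name__ == "__main__":
    ok = screenshots_match("reference_scene.png", "candidate_scene.png")
    print("Visual check passed" if ok else "Visual difference exceeds threshold")
```

Flagged differences still require human judgment, since intentional rendering improvements will also trip a pixel diff.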
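For performance testing, captured frame times are easier to compare across builds than an eyeballed FPS counter. The sketch below is a minimal summary script, assuming frame times in milliseconds have been exported to a one-value-per-line CSV by whatever capture method the tester uses; the file name and helper are illustrative.

```python
"""Minimal sketch for summarising captured frame times during performance testing.

Assumes a CSV with one frame time (in milliseconds) per line; the file name
and the metrics chosen are illustrative.
"""
import csv
import statistics
from pathlib import Path


def summarise_frame_times(csv_path: Path) -> dict:
    with csv_path.open(newline="") as f:
        frame_ms = [float(row[0]) for row in csv.reader(f) if row]
    if not frame_ms:
        raise ValueError(f"No frame times found in {csv_path}")

    frame_ms.sort()
    # "1% low" FPS: average of the slowest 1% of frames, a common stutter metric.
    worst_1pct = frame_ms[-max(1, len(frame_ms) // 100):]
    return {
        "frames": len(frame_ms),
        "avg_fps": 1000.0 / statistics.fmean(frame_ms),
        "one_pct_low_fps": 1000.0 / statistics.fmean(worst_1pct),
        "max_frame_ms": frame_ms[-1],
    }


if __name__ == "__main__":
    print(summarise_frame_times(Path("frame_times.csv")))
```

Comparing the same scene, camera position, and settings on the current release and the candidate build keeps the resulting numbers meaningful.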
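Build verification can likewise be partly scripted: the sketch below only checks that a freshly installed build launches and survives a startup grace period. The executable path is an assumed Windows install location, and the login plus core-feature checks remain manual steps.

```python
"""Minimal smoke-test (BVT) sketch: launch the viewer and confirm it stays up.

The install path and grace period are assumptions for illustration.
"""
import subprocess
import time
from pathlib import Path

VIEWER_EXE = Path(r"C:\Program Files\ApertureViewer\ApertureViewer.exe")  # assumed path
STARTUP_GRACE_SECONDS = 60


def smoke_test() -> bool:
    proc = subprocess.Popen([str(VIEWER_EXE)])
    time.sleep(STARTUP_GRACE_SECONDS)
    crashed = proc.poll() is not None  # poll() returns an exit code if the process died
    if not crashed:
        proc.terminate()  # shut down the test instance
    return not crashed


if __name__ == "__main__":
    print("Smoke test passed" if smoke_test()
          else "Smoke test FAILED: viewer exited during startup")
```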
11.3 Test Plan Development
For significant new features, major releases, or complex changes, the Head of Quality Assurance, in collaboration with the Project Lead, will endeavor to develop formal or informal test plans.
- Purpose of Test Plans:
- To define the scope of testing for a specific feature or release.
- To identify key functionalities and test scenarios to be covered.
- To outline testing methodologies and acceptance criteria.
- To serve as a checklist for testers.
- Content of a Test Plan (may vary in formality):
- Introduction/Overview: Brief description of the feature/release being tested.
- Scope of Testing: What will be tested, and what (if anything) is explicitly out of scope.
- Test Objectives: What the testing aims to achieve (e.g., verify functionality X, assess performance of Y, ensure no regressions in Z).
- Features to be Tested: A list of specific features or functionalities.
- Test Cases/Scenarios: High-level descriptions of tests to be performed, including steps and expected outcomes. (Detailed step-by-step test cases may be developed for critical or complex areas). A machine-readable test-case sketch follows this list.
- Test Environment: Recommended hardware/software configurations for testing (if specific).
- Pass/Fail Criteria: How to determine if a test case passes or fails.
- Reporting: How and where to report issues found.
- Development and Review:
- Test plans will be drafted by the Head of QA.
- The Project Lead will review test plans for technical accuracy and completeness regarding the features implemented.
- Test plans will be shared with any involved testers.
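Where a test plan benefits from a more structured checklist, individual test cases can be captured in a simple machine-readable form. The sketch below shows one possible shape, assuming Python is used for QA tooling; the field names and example content are illustrative only, and informal written plans remain acceptable.

```python
"""Sketch of a machine-readable test case mirroring the plan contents above.

Field names and the example case are illustrative only.
"""
from dataclasses import dataclass, field


@dataclass
class TestCase:
    case_id: str                       # e.g. a short identifier such as "PT-001"
    feature: str                       # feature or functionality under test
    steps: list[str] = field(default_factory=list)
    expected_result: str = ""
    status: str = "not run"            # "not run" | "pass" | "fail"


example = TestCase(
    case_id="PT-001",
    feature="Phototools panel (illustrative)",
    steps=[
        "Launch the viewer and log in",
        "Open the Phototools panel and adjust a setting",
    ],
    expected_result="The setting takes effect immediately with no errors logged",
)
```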
11.4 Bug Reporting and Triage Process
Effective bug reporting and triage are crucial for addressing issues efficiently.
- Primary Reporting Channel: GitHub Issues:
- All suspected bugs discovered during QA testing or reported by users should be formally logged as issues in the `ApertureViewer` GitHub Issues tracker.
- Refer to Appendix [M] (GitHub Issue Management) for detailed guidelines on creating and managing issues.
- Information to Include in Bug Reports (a collection-helper sketch follows this list):
- Clear, concise title summarizing the bug.
- Aperture Viewer version number.
- Operating System and version.
- Graphics card model and driver version.
- Detailed, step-by-step instructions to reliably reproduce the bug.
- What was expected to happen.
- What actually happened.
- Screenshots or short videos illustrating the bug (highly recommended).
- Relevant portions of the `ApertureViewer.log` file (found in `%AppData%\ApertureViewer\logs\` on Windows).
- Any specific settings or conditions that trigger the bug.
- Initial Triage (Head of QA / Project Lead):
- New issues will be reviewed (triaged) promptly.
- Verification: Attempt to reproduce the reported bug.
- Clarification: If more information is needed, request it from the reporter (add the `needs-info` label).
- Prioritization: Assign a priority level (e.g., `Critical`, `High`, `Medium`, `Low`) based on severity, impact, and frequency.
- Labeling: Apply relevant labels (e.g., `bug`, `crash`, `ui`, `performance`, `visual-artifact`, or a feature-area label like `phototools` or `stars`). A labeling sketch follows this list.
- Duplicate Check: Check if the issue is a duplicate of an existing report. If so, link and close the duplicate.
- Assignment: Assign the issue to the Project Lead (developer) if it's a confirmed bug requiring a code fix, or to the Head of QA for further investigation if needed.
- Bug Lifecycle:
- Issues progress through states such as `Open`, `In Progress` (when being worked on), `Pending QA Verification` (when a fix is committed), and `Closed` (when verified as fixed or otherwise resolved).
- Communication on the GitHub issue should be maintained throughout its lifecycle.
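Much of the environment information listed above can be gathered automatically before filing an issue. The sketch below is a minimal helper, assuming the Windows log location given in the reporting guidelines; the viewer version and GPU/driver details still need to be filled in by the reporter, and the helper itself is illustrative rather than part of the viewer.

```python
"""Minimal sketch for gathering the bug-report details listed above.

Collects OS information and the tail of the most recent ApertureViewer.log;
viewer version and GPU/driver details must still be filled in manually.
"""
import os
import platform
from pathlib import Path

# Windows log location noted in the reporting guidelines above.
LOG_DIR = Path(os.path.expandvars(r"%AppData%\ApertureViewer\logs"))


def collect_report_stub(log_lines: int = 100) -> str:
    parts = [
        f"Operating System: {platform.system()} {platform.release()} ({platform.version()})",
        "Aperture Viewer version: <fill in from the About dialog>",
        "GPU model and driver version: <fill in manually>",
    ]
    logs = sorted(LOG_DIR.glob("*.log"), key=lambda p: p.stat().st_mtime)
    if logs:
        tail = logs[-1].read_text(errors="replace").splitlines()[-log_lines:]
        parts.append(f"--- last {log_lines} lines of {logs[-1].name} ---")
        parts.extend(tail)
    return "\n".join(parts)


if __name__ == "__main__":
    print(collect_report_stub())
```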
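Triage labels can be applied through the GitHub web UI, the `gh` CLI, or the REST API. The sketch below uses the REST API's issue-labels endpoint; the repository path, token handling, and example labels are assumptions for illustration.

```python
"""Minimal sketch for applying triage labels via the GitHub REST API.

The repository path and token handling are assumptions; labels can equally
be applied by hand in the GitHub UI.
"""
import os

import requests

REPO = "ApertureViewer/Aperture-Viewer"  # assumed owner/repo path
TOKEN = os.environ["GITHUB_TOKEN"]       # personal access token with issue write access


def add_triage_labels(issue_number: int, labels: list[str]) -> None:
    # POST /repos/{owner}/{repo}/issues/{issue_number}/labels appends labels to an issue.
    resp = requests.post(
        f"https://api.github.com/repos/{REPO}/issues/{issue_number}/labels",
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/vnd.github+json"},
        json={"labels": labels},
        timeout=30,
    )
    resp.raise_for_status()


# Example: confirm a crash report during triage (issue number is hypothetical).
# add_triage_labels(123, ["bug", "crash"])
```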
11.5 Release Candidate (RC) Testing Process
Before a new version of Aperture Viewer is publicly released, it will typically go through a Release Candidate (RC) testing phase.
- Purpose: To conduct final, comprehensive testing on a build that is intended to be the public release version, ensuring it meets quality standards and identifying any critical, release-blocking issues.
- Build Preparation: The Project Lead will prepare an RC build from the `release/vX.Y.Z` branch. This build will include all features and fixes planned for the upcoming release.
- Internal RC Testing (Core Team):
- The Head of QA and the Project Lead (and potentially other Core Leadership members) will conduct intensive testing of the RC build.
- This phase focuses on:
- Executing key test plans and regression test suites.
- Verifying all new features and major changes.
- Checking for any critical bugs or showstoppers.
- Final performance and stability checks.
- Beta Tester RC Testing (Optional but Recommended):
- If a group of trusted beta testers is established, the RC build may be distributed to them for a short period of focused testing.
- Beta testers will be asked to focus on specific areas or conduct general usage testing and report any issues immediately.
- Issue Reporting during RC Phase:
- Any issues found during RC testing must be reported immediately via GitHub Issues and flagged as critical if they are potential release blockers. A blocker-listing sketch follows this list.
- Go/No-Go Decision:
- Based on the results of RC testing, the Head of QA will provide a quality assessment and recommendation to the Project Lead.
- If critical issues are found, a new RC build (e.g., RC2) incorporating fixes will be prepared, and the RC testing cycle will repeat.
- The Project Lead makes the final "Go/No-Go" decision for releasing the RC build as the official public version.
- Changelog Finalization: During the RC phase, the official changelog for the release will be finalized by the Project Lead, with input from the Head of QA regarding fixed issues.
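Ahead of the Go/No-Go decision, it can help to list any open issues still flagged as release blockers. The sketch below queries the GitHub issues endpoint for open issues carrying a blocker label; the repository path and the exact label name are assumptions.

```python
"""Minimal sketch for listing open release blockers before a Go/No-Go call.

The repository path and blocker label are assumptions for illustration.
"""
import requests

REPO = "ApertureViewer/Aperture-Viewer"  # assumed owner/repo path
BLOCKER_LABEL = "Critical"               # assumed label used to flag release blockers


def open_release_blockers() -> list[str]:
    # GET /repos/{owner}/{repo}/issues supports filtering by state and labels.
    resp = requests.get(
        f"https://api.github.com/repos/{REPO}/issues",
        params={"state": "open", "labels": BLOCKER_LABEL, "per_page": 100},
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    resp.raise_for_status()
    # The issues endpoint also returns pull requests, which carry a "pull_request" key.
    return [f"#{i['number']}: {i['title']}" for i in resp.json() if "pull_request" not in i]


if __name__ == "__main__":
    blockers = open_release_blockers()
    print("No open release blockers" if not blockers else "\n".join(blockers))
```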
This QA and Testing framework is designed to be adaptable to the project's scale and resources, with a constant focus on delivering a high-quality, innovative, and enjoyable experience for Aperture Viewer users. The Head of Quality Assurance will be responsible for the day-to-day implementation and refinement of these processes.