Sprint 2 ‐ MVP V2 - JeanCarloLondo/SpectRA GitHub Wiki
Second version of the Business Plan
6. Marketing & Sales Strategy
Objective
Make SpectRA known and adopted across the university community (students, visitors, staff) and establish initial traction for a campus-wide pilot at EAFIT. Prepare a repeatable playbook that can be used to approach other universities later.
Target audience
- Primary: New students and campus visitors (orientation week, open houses).
- Secondary: University staff and faculty (administrative adoption, content providers).
- Tertiary: Prospective students (marketing material for recruitment).
Value proposition for the audience
- For students & visitors: Instant, mobile AR wayfinding and contextual building information without extra hardware.
- For staff & university: Modern campus engagement tool for orientation, events, outreach and digital archiving of building metadata.
Go-to-market channels & tactics
**Campus Launch Event & Demos (High impact)**
- Organize a booth during orientation week and at selected campus events. Live demo with QR codes for a quick APK install and on-device trials.
- KPI: installs per event, demo-to-install conversion.
**University Communication Channels (Low cost, trusted)**
- Coordinate with university communications to publish a feature in the campus newsletter, official social media (Instagram or Facebook), and the university website (Interactiva or Epik).
- KPI: referral traffic from official channels.
**Campus Ambassadors / Peer Outreach**
- Recruit a small group of student ambassadors to promote the app in faculties and student groups (word of mouth + social posts).
- KPI: installs from ambassador promo codes.
**Social Media Ads & Content (Targeted)**
- Short demo clips showing AR overlays and wayfinding. Targeted ads toward first-year students and campus visitors.
- KPI: cost per install, engagement rate.
**Posters & QR Code Signage (Ongoing physical presence)**
- Place posters and stickers near entrances, information desks, and event booths. Include a QR code to download the app or access remote exploration mode.
- KPI: scans per poster.
**Workshops for Staff & Admins**
- Short training sessions for administrative staff on how to update content in the CMS and how to use admin verification features.
- KPI: number of verified entries created by admins.
**Pilot Partnerships & Testimonials**
- Partner with the Admissions Office and Campus Tours to integrate SpectRA into formal tours. Collect testimonials and usage metrics for future sales to other universities.
- KPI: pilot adoption rate, number of departments using SpectRA.
Pricing & Sales model (early thinking)
- Pilot / Institutional License: Offer the university an institutional license for premium features (e.g., admin panel, priority support, custom content integration). Pricing negotiable; initial pilots could be free or highly discounted in exchange for access/data and official endorsement.
- Service Revenue: Paid content integration or custom 3D modeling services for departments that want advanced AR content.
- Freemium (for eventual public release): Core features free to students; premium content (campus-exclusive multimedia packages, analytics) behind a paid tier for other institutions.
Marketing timeline (first 3 months)
- Week 1–2: Prepare marketing materials (demo video, posters, one-pager).
- Week 3: Soft launch with ambassadors + staff workshop.
- Week 4: Official launch event during orientation or a major campus day.
- Month 2–3: Social ads + university channels + pilot feedback collection.
KPIs to measure success
- Number of downloads / installs
- Active users (weekly)
- Average session length (time using AR)
- Conversion rate from demos to installs
- Number of verified building entries (admin adoption)
- Cost per install (for paid channels)
Budget note (included in Section 7)
- Marketing items (launch event, posters, social ads, swag) are included in the Finance table under “Marketing & Sales”. Budget allocation considers a student-project-scale pilot.
7. Finances
Pre-operation Budget (Estimated)
Funding Sources
Own contribution: Team-funded items.
University: Covers all remaining project expenses.
Budget Breakdown
| ITEM (RUBROS) | OWN (COP / USD) | UNIVERSITY (COP / USD) | TOTAL (COP / USD) |
|---|---|---|---|
| PERSONNEL (stipends, testers, small dev compensation) | 1,173,315 COP (300 USD) | 10,559,835 COP (2,700 USD) | 11,733,150 COP (3,000 USD) |
| EQUIPMENT (test smartphone, external storage, small sensors) | 782,210 COP (200 USD) | 5,084,365 COP (1,300 USD) | 5,866,575 COP (1,500 USD) |
| SOFTWARE (Unity / paid tiers, Firebase paid tier, dev tools) | 391,105 COP (100 USD) | 1,564,420 COP (400 USD) | 1,955,525 COP (500 USD) |
| MATERIALS (posters, printing, signage, QR stickers) | 391,105 COP (100 USD) | 782,210 COP (200 USD) | 1,173,315 COP (300 USD) |
| FIELD TRIPS (transport for capturing images, data collection) | 195,552 COP (50 USD) | 586,658 COP (150 USD) | 782,210 COP (200 USD) |
| BIBLIOGRAPHIC (books, paid articles, subscriptions) | 391,105 COP (100 USD) | 0 COP (0 USD) | 391,105 COP (100 USD) |
| PUBLICATIONS / PATENTS / SOFTWARE REGISTRATION | 0 COP (0 USD) | 1,564,420 COP (400 USD) | 1,564,420 COP (400 USD) |
| TECHNICAL SERVICES (cloud credits, model training, optimization) | 782,210 COP (200 USD) | 3,911,050 COP (1,000 USD) | 4,693,260 COP (1,200 USD) |
| TRAVEL (small allocation for conference/presentation travel) | 195,552 COP (50 USD) | 2,151,078 COP (550 USD) | 2,346,630 COP (600 USD) |
| MARKETING & SALES (launch event, social ads, demo video, swag) | 782,210 COP (200 USD) | 3,911,050 COP (1,000 USD) | 4,693,260 COP (1,200 USD) |
| GRAND TOTAL | 5,084,364 COP (1,300 USD) | 30,115,086 COP (7,700 USD) | 35,199,450 COP (9,000 USD) |
Short Budget Justification
- Personnel: Small stipends/incentives to pay data labelers/testers and partially compensate final development work. The team contribution demonstrates commitment.
- Equipment: Prefer university equipment loans; a small cash allocation covers a spare test device and portable storage.
- Software / Technical services: Firebase paid tiers, cloud compute credits for ML training, and small software licenses where necessary. The university typically provides educational licenses/credits, but these are budgeted conservatively.
- Marketing & Sales: Physical launch materials (posters, QR stickers), demo booth, light social media ads, and small swag for ambassadors to generate adoption.
- Publications / Registration: Low-priority allocation for any formal registration or publication fees (optional).
- Travel / Field Trips: Local transport for on-site data collection and a small travel fund for project presentations.
Operation analysis — Operational budget & break-even
1) Summary — purpose
This section estimates operational monthly costs (fixed and variable), projects how many paying customers or licenses are required to break even, and suggests pricing / revenue scenarios suitable for a university-pilot AR product like SpectRA.
2) Fixed monthly costs (COP & USD)
Fixed costs are those that do not depend on number of users in the short term.
| Fixed cost item | Monthly (COP) | Monthly (USD ≈) |
|---|---|---|
| Personnel (part-time maintainer / support / small stipends) | 2,000,000 | $511 |
| Cloud & Firebase hosting | 500,000 | $128 |
| Content delivery & media storage (CDN) | 300,000 | $77 |
| Marketing & community (ongoing ads / materials) | 300,000 | $77 |
| Software licensing / admin tools | 200,000 | $51 |
| Device depreciation & test hardware (monthly equivalent) | 200,000 | $51 |
| Subtotal (base fixed) | 3,500,000 | $895 |
| Contingency (10%) | 350,000 | $89 |
| TOTAL fixed monthly | 3,850,000 COP | ≈ $984.39 USD |
3) Variable costs (per active user / per month)
Variable costs scale with the number of active users (bandwidth, storage growth, on-demand support).
Estimated variable cost per active user / month: 500 COP (≈ $0.128), covering data transfer, thumbnails, and small support & logging costs.
Examples:
- 500 active users ⇒ variable = 500 × 500 = 250,000 COP (≈ $63.92); total monthly cost = 3,850,000 + 250,000 = 4,100,000 COP (≈ $1,048.31)
- 1,000 active users ⇒ variable = 500,000 COP (≈ $127.84); total monthly = 4,350,000 COP (≈ $1,112.23)
- 2,000 active users ⇒ variable = 1,000,000 COP (≈ $255.69); total monthly = 4,850,000 COP (≈ $1,240.08)
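The cost model above can be expressed as a small helper. This is a minimal sketch using the figures from the tables; the function names are illustrative, not part of any existing codebase:

```python
FIXED_MONTHLY_COP = 3_850_000   # base fixed costs + 10% contingency (from table above)
VARIABLE_PER_USER_COP = 500     # bandwidth, storage, support per active user / month
COP_PER_USD = 3_911.05          # exchange rate used throughout this plan

def total_monthly_cost(active_users: int) -> int:
    """Total monthly operating cost in COP for a given number of active users."""
    return FIXED_MONTHLY_COP + VARIABLE_PER_USER_COP * active_users

def cop_to_usd(cop: float) -> float:
    """Convert COP to USD at the plan's fixed exchange rate."""
    return round(cop / COP_PER_USD, 2)

for users in (500, 1_000, 2_000):
    cost = total_monthly_cost(users)
    print(f"{users} users -> {cost:,} COP (~${cop_to_usd(cost):,.2f})")
# 500 users  -> 4,100,000 COP (~$1,048.31)
# 1000 users -> 4,350,000 COP (~$1,112.23)
# 2000 users -> 4,850,000 COP (~$1,240.08)
```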
First year of operation
Assumptions (sales plan used to build the table):
- Pilot launch during Month 1 (unpaid pilot / university-supported; the pilot license is not counted as paid initially).
- A mixture of one-time service sales (3D models, workshops) and institutional license sales throughout the year.
- When a license (Standard or Premium) is sold, its amortized monthly revenue (annual price / 12) counts for the remaining months of the year from the sale month onward.
- Support packages are monthly recurring from the month sold.
The table below uses the following columns:
- New units sold this month (Standard / Premium / 3D models / Workshops / New support subscriptions)
- Cumulative active recurring contracts (Standard / Premium / Support)
- Recurring revenue this month (COP)
- One-time revenue this month (COP)
- Total revenue this month (COP and USD ≈)
(Exchange rate used to convert totals to USD: 1 USD = 3,911.05 COP)
Month-by-month table
| Month | New Std sold | New Prem sold | New 3D models | New Workshops | New support subs | Cum Std | Cum Prem | Cum Support | Recurring (COP) | One-time (COP) | Total (COP) | Total (USD ≈) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | $0.00 |
| 2 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 500,000 | 500,000 | $127.84 |
| 3 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2,000,000 | 2,000,000 | $511.37 |
| 4 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1,666,667 | 0 | 1,666,667 | $426.14 |
| 5 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 2,166,667 | 2,000,000 | 4,166,667 | $1,065.36 |
| 6 | 1 | 0 | 0 | 0 | 0 | 2 | 0 | 1 | 3,833,333 | 0 | 3,833,333 | $980.13 |
| 7 | 0 | 0 | 0 | 2 | 0 | 2 | 0 | 1 | 3,833,333 | 1,000,000 | 4,833,333 | $1,235.81 |
| 8 | 1 | 0 | 0 | 0 | 0 | 3 | 0 | 1 | 5,500,000 | 0 | 5,500,000 | $1,406.27 |
| 9 | 0 | 1 | 0 | 0 | 0 | 3 | 1 | 1 | 9,666,667 | 0 | 9,666,667 | $2,471.63 |
| 10 | 0 | 0 | 2 | 0 | 0 | 3 | 1 | 1 | 9,666,667 | 4,000,000 | 13,666,667 | $3,494.37 |
| 11 | 1 | 0 | 0 | 0 | 0 | 4 | 1 | 1 | 11,333,333 | 0 | 11,333,333 | $2,897.77 |
| 12 | 0 | 0 | 0 | 3 | 1 | 4 | 1 | 2 | 11,833,333 | 1,500,000 | 13,333,333 | $3,409.14 |
Annual total revenue (sum of 12 months): 70,500,000 COP ≈ 18,025.85 USD
How the recurring revenue was computed (clarity)
- Standard license monthly contribution = 20,000,000 / 12 = 1,666,667 COP / month.
- Premium license monthly contribution = 50,000,000 / 12 = 4,166,667 COP / month.
- Each time a new license is sold, that monthly contribution is added to the recurring revenue from that month onward.
- Support subscriptions add 500,000 COP to recurring revenue per support client per month.
- One-time sales (3D models and workshops) are counted entirely in the month sold.
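The rules above can be checked with a short script that replays the sales plan from the month-by-month table and accumulates the year's revenue. This is an illustrative sketch; the variable names and dictionary layout are our own, and the sales schedule is hard-coded from the table:

```python
# Amortized monthly license contributions and unit prices (COP)
STD = 20_000_000 / 12        # Standard license ≈ 1,666,667 / month
PREM = 50_000_000 / 12       # Premium license ≈ 4,166,667 / month
SUPPORT = 500_000            # monthly recurring per support client
MODEL_3D = 2_000_000         # one-time, per building
WORKSHOP = 500_000           # one-time, per session

# month -> units sold, taken from the month-by-month table
std_sales = {4: 1, 6: 1, 8: 1, 11: 1}
prem_sales = {9: 1}
support_sales = {5: 1, 12: 1}
models = {3: 1, 5: 1, 10: 2}
workshops = {2: 1, 7: 2, 12: 3}

annual = 0.0
cum_std = cum_prem = cum_supp = 0
for month in range(1, 13):
    cum_std += std_sales.get(month, 0)
    cum_prem += prem_sales.get(month, 0)
    cum_supp += support_sales.get(month, 0)
    # recurring revenue accumulates from the sale month onward
    recurring = cum_std * STD + cum_prem * PREM + cum_supp * SUPPORT
    # one-time sales count entirely in the month sold
    one_time = models.get(month, 0) * MODEL_3D + workshops.get(month, 0) * WORKSHOP
    annual += recurring + one_time

print(f"Annual revenue: {annual:,.0f} COP")  # Annual revenue: 70,500,000 COP
```

Replaying the plan this way reproduces the stated annual total of 70,500,000 COP, which is a useful regression check whenever the sales schedule or prices change.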
Key results & interpretation
- Total projected first-year revenue: 70,500,000 COP (≈ $18,025.85 USD).
- Break-even on monthly fixed costs (estimated fixed monthly ≈ 3,850,000 COP): monthly revenue first crosses the fixed-cost baseline around Months 5–7 and stays above it from Month 7 onward.
- Concretely, every month from Month 7 through Month 12 shows revenue ≥ 3,850,000 COP, so by mid-year the operation becomes revenue-positive relative to base fixed costs (assuming the university no longer subsidizes operations).
- The fastest path to covering monthly operations is landing three Standard licenses or a single Premium license (see the earlier break-even analysis). In this projection, landing one Premium in Month 9 produces a significant recurring-revenue jump.
Type of product or service to be offered and sales value per unit or customer:
| Product / Service | Price per unit (COP) | Price per unit (USD ≈) | Billing / Note |
|---|---|---|---|
| Pilot institutional license (introductory) | 6,000,000 COP | $1,534.11 | Annual pilot license — discounted; useful for initial campus rollout |
| Standard institutional license | 20,000,000 COP | $5,113.72 | Annual license — recommended for full campus deployment |
| Premium institutional license | 50,000,000 COP | $12,784.29 | Annual, includes premium features, priority support, customization |
| 3D model (per building) | 2,000,000 COP | $511.37 | One-time fee to create a high-quality 3D asset for a building |
| Training workshop (per session) | 500,000 COP | $127.84 | One-time on-site or virtual workshop for admin/staff (per session) |
| Support package | 500,000 COP / month | $127.84 / month | Monthly recurring support (SLAs, content assistance) |
| Per-student subscription (optional) | 1,000 COP / student / month | $0.26 / student / month | Low-cost per-seat option |
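The break-even claim from the operation analysis (a few Standard licenses or one Premium license covering fixed costs) can be sanity-checked against this price list. A minimal sketch, ignoring variable per-user costs:

```python
import math

FIXED_MONTHLY = 3_850_000        # fixed monthly cost incl. contingency (COP)
std_monthly = 20_000_000 / 12    # Standard license, amortized ≈ 1,666,667 COP / month
prem_monthly = 50_000_000 / 12   # Premium license, amortized ≈ 4,166,667 COP / month

# Standard licenses needed so amortized revenue covers fixed costs
licenses_needed = math.ceil(FIXED_MONTHLY / std_monthly)
print(licenses_needed)                    # 3

# A single Premium license alone exceeds the fixed baseline
print(prem_monthly >= FIXED_MONTHLY)      # True
```

Note that two Standard licenses (≈ 3,333,333 COP / month) fall just short of the baseline; the third license is what tips the balance.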
Short pricing notes
- Licenses are annual and may be invoiced upfront or billed in agreed installments; clarify payment terms in contracts.
- 3D modeling and workshops are one-time professional services and work well as upsells when negotiating institutional licenses.
- The support package is recurring and provides predictable monthly revenue; consider bundling it with Standard/Premium licenses at a discount.
- Per-student fees are optional and useful if an institution prefers per-seat billing; they require high adoption to be a meaningful revenue stream.
Projection in the first year of operation:
Monthly Financial Projection
| Month | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Estimated Clients or Sales | 1 | 1 | 2 | 2 | 3 | 3 | 4 | 4 | 5 | 5 | 6 | 6 |
| Revenues (COP) | 6,000,000 | 6,000,000 | 12,000,000 | 12,000,000 | 18,000,000 | 18,000,000 | 24,000,000 | 24,000,000 | 30,000,000 | 30,000,000 | 36,000,000 | 36,000,000 |
| Expenses (COP) | 3,500,000 | 3,500,000 | 3,800,000 | 3,800,000 | 4,200,000 | 4,200,000 | 4,600,000 | 4,600,000 | 5,000,000 | 5,000,000 | 5,400,000 | 5,400,000 |
Annual totals: Revenue 252,000,000 COP; Expenses 53,000,000 COP; Net Profit 199,000,000 COP. (The client count reaches 6 by Month 12; the figure of 42 is the sum of the monthly client counts, i.e., client-months.)
Notes:
- Estimated Clients: institutions or paying entities (e.g., universities or schools) adopting SpectRA licenses or pilot services.
- Revenues: assume most are pilot institutional licenses (6,000,000 COP each), with some upgraded to Standard licenses mid-year.
- Expenses: include staff stipends, minor hardware/software costs, and marketing materials.
- Net Profit: positive due to university sponsorship covering initial R&D and infrastructure.
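The annual totals can be verified directly from the monthly rows of the projection table (figures copied from the table above; this is an illustrative check, not project code):

```python
# Monthly rows from the projection table
clients = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6]
revenues = [6, 6, 12, 12, 18, 18, 24, 24, 30, 30, 36, 36]  # millions COP
expenses = [3.5, 3.5, 3.8, 3.8, 4.2, 4.2,
            4.6, 4.6, 5.0, 5.0, 5.4, 5.4]                  # millions COP

print(f"Clients by Month 12: {clients[-1]} (sum of monthly counts: {sum(clients)})")
print(f"Revenue: {sum(revenues):.0f}M COP, "
      f"Expenses: {sum(expenses):.0f}M COP, "
      f"Net: {sum(revenues) - sum(expenses):.0f}M COP")
# Revenue: 252M COP, Expenses: 53M COP, Net: 199M COP
```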
Balance and break-even point
Annual Projection Summary
Marketing and Sales Strategy
- Target Audience: University institutions and educational organizations seeking innovative and immersive ways to enhance student learning through augmented reality.
- Justification: The educational technology (EdTech) market is rapidly expanding, with institutions actively adopting digital and immersive learning solutions. This aligns with SpectRA’s value proposition of offering interactive and spatially contextualized experiences.
- Promotion and Sales Channels:
  - Partnerships with universities through innovation and digital transformation offices.
  - Demonstration events, educational fairs, and pilot programs.
  - Digital marketing via LinkedIn, academic forums, and university websites.
  - Word-of-mouth promotion through students and professors involved in pilot tests.
Finance
- Pre-operation Budget: Includes all necessary items such as equipment (computers, testing devices), software licenses, materials for prototypes, technical services, and legal registration costs.
- Operating Costs: Divided into fixed (staff, utilities, hosting, administrative expenses) and variable (modeling services, travel, materials) costs.
- Total Pre-operational Budget: COP 19,700,000 (≈ USD 5,040), fully covered by university support and the founding team’s contribution.
Projection
- Total Annual Revenues: COP 252,000,000 (≈ USD 64,430).
- Total Annual Expenses: COP 53,000,000 (≈ USD 13,550).
- Net Annual Profit: COP 199,000,000 (≈ USD 50,880).
- Break-even Point: Achieved in Month 1, due to institutional contracts and low operating overhead.
- Growth Outlook: Gradual increase in institutional clients (from 1 to 6 per year), with proportional growth in 3D model and training service sales.
Balance
| Category | Description |
|---|---|
| Assets | Software infrastructure, development equipment, and intellectual property. |
| Liabilities | None in the first year (project supported by the university). |
| Equity | Founders’ contribution in development effort and technical assets. |
| Financial Structure | Healthy and sustainable — no external debt, positive cash flow, and growing profit margins. |
Bugs Reported
During test execution, the following issues were identified:
| User Story | Test Case ID | Description | Status | Bug ID / Notes |
|---|---|---|---|---|
| HU #5 – See buildings info | CP5.2 | Fallback for Missing Data | ❌ Fail | Bug #3 - information for a non-existing schedule is not provided |
| HU #5 – See buildings info | CP5.3 | Data Accuracy | ✅ Pass (after fix) | When a building is detected, its data is shown correctly |
| HU #5 – See buildings info | CP5.4 | Close Information Overlay should be easy to execute | ✅ Pass | The close button can be pressed easily in the overlay |
| HU #12 - View service schedules | CP12.1 | Display of Operating Hours | ✅ Pass | The hours shown on the first attempt were accurate |
| HU #12 - View service schedules | CP12.2 | Real-Time Data Refresh | ❌ Fail | The information does not refresh in the app and requires reopening the app |
| HU #20 - Trust displayed data | CP20.1 | Display of Last Update Date | ✅ Pass | The information shown in the app is always the newest in the data source |
| HU #20 - Trust displayed data | CP20.3 | Data Integrity on Sync | ✅ Pass | We refreshed the app at a high rate and it always showed the most recent information without crashing |
Usability Testing Protocol
1. Objective
The goal of this usability test is to evaluate how effectively and intuitively users can explore building information within the AR Campus Explorer app. The test focuses on verifying whether users can understand the interface, access relevant information, and trust the data displayed about campus buildings.
2. Participants
- Target users: University visitors and students.
- Profile: Individuals unfamiliar with all campus buildings but with basic mobile app experience.
- Sample size: 5–7 participants (initial evaluation phase).
3. Test Environment
- Device: Android smartphone with AR capabilities.
- Setting: Outdoor campus area (near Building 19 and other landmarks).
- Prototype version: Sprint 2 functional build (AR recognition and info overlays enabled).
4. Tasks
| # | Task Description | Expected Outcome |
|---|---|---|
| T1 | Open the app and point the camera at a building to trigger recognition. | The system identifies the building and displays its name within 3 seconds. |
| T2 | Observe the building information overlay and read its details (name, type, description). | The overlay appears clearly and the text is legible. |
| T3 | Tap on the overlay to view extended building details (e.g., services and schedule). | A detailed info panel is displayed with correct and current data. |
| T4 | Explore available multimedia content (photos/videos) for the recognized building. | User can view and navigate the content without confusion. |
| T5 | Compare information between two buildings by moving the camera from one to another. | Overlay updates dynamically with the new building’s information. |
| T6 | Return to the main screen and close the information panel. | The user exits smoothly without errors or confusion. |
5. Hypotheses
| ID | Hypothesis |
|---|---|
| H1 | Users will successfully recognize a building and access its details within 10 seconds. |
| H2 | The information overlay will be perceived as clear and visually aligned with the real-world view. |
| H3 | At least 80% of users will find the process of exploring building information intuitive. |
| H4 | The trust in displayed data increases when the content includes verified schedules and services. |
| H5 | The inclusion of multimedia (photos/videos) will enhance user satisfaction by ≥20%. |
6. Research Questions
| ID | Question |
|---|---|
| Q1 | Do users understand how to trigger building recognition easily? |
| Q2 | Can users clearly read and interpret the overlaid information? |
| Q3 | Is the transition between buildings smooth and consistent? |
| Q4 | How confident do users feel about the accuracy of the data displayed? |
| Q5 | Does the multimedia content improve engagement and comprehension? |
7. Evaluation Criteria
| Metric | Description | Measurement Method | Target |
|---|---|---|---|
| Task Success Rate | Percentage of tasks completed correctly. | Observation checklist | ≥ 90% |
| Recognition Time | Time between pointing the camera and seeing the name. | Stopwatch / app logs | ≤ 3 seconds |
| Readability & Alignment | Clarity of overlay text and accuracy of AR positioning. | Participant feedback | ≥ 4/5 satisfaction |
| User Satisfaction | Overall user experience rating. | Post-test questionnaire | ≥ 4/5 |
| Error Rate | Frequency of recognition or navigation errors. | Observation | ≤ 10% |
| Trust Level | User confidence in the accuracy of displayed information. | Likert scale | ≥ 4/5 |
8. Data Collection Methods
- Direct observation during task completion.
- Screen recording (if allowed).
- Post-test questionnaire with Likert-scale and open-ended questions.
- Notes on behavioral cues (hesitations, confusion, corrections).
9. Expected Results
The usability test is expected to confirm that:
- Users can identify buildings quickly and accurately.
- The overlay and info display are easy to read and interact with.
- Trust and satisfaction increase as the app shows verified data and multimedia content.
Automated Software Testing Strategy
Recommended Testing Tools (Unity)
| Type of Test | Recommended Tool | Description |
|---|---|---|
| Unit Tests | Unity Test Framework (NUnit-based) | For testing individual scripts, logic, and algorithms in isolation. |
| Integration Tests | Unity Test Runner + Play Mode Tests | For verifying interactions between AR modules, ML recognition, and UI components. |
| E2E (End-to-End) | AltTester Unity SDK | For simulating full user flows within the Unity app (camera → recognition → overlay → info display). |
| Performance Tests | Unity Performance Testing Package | For validating recognition time, rendering FPS, and memory usage. |
Testing Coverage Table
| Functionality | Type of Test | Justification |
|---|---|---|
| Recognize buildings | Unit / Performance | Tests the ML model accuracy and response time. Automated tests validate that recognition accuracy stays >70% and responses occur within <3 seconds. |
| Visualize structures with ML insights | Integration / E2E | Ensures the ML output correctly triggers AR visualizations and overlays in real time, maintaining alignment with real-world objects. |
| View contextual overlays | Integration | Confirms that recognized buildings dynamically display accurate overlays with readable text and updated metadata. |
| See building info | Integration / Unit | Verifies that the app retrieves and displays updated building information (name, hours, services) after recognition. |
| View service schedules | Integration | Checks that schedule data is fetched and formatted correctly from the backend when a building is recognized. |
| Explore history | Unit / Integration | Ensures that verified historical content is retrieved and displayed correctly when the user accesses building history. |
| Access multimedia content | Integration / E2E | Validates that photos and videos are loaded efficiently and match the selected building context. |
| Trust displayed data | Property / Integration | Automatically checks data freshness and validation flags from the backend to guarantee information reliability. |
| Provide real-time interaction | Performance / E2E | Measures latency between recognition, overlay updates, and user interactions, ensuring real-time responsiveness. |
| Optimize loading | Performance | Monitors loading times and memory usage when switching buildings or multimedia assets. |
| Get directional guidance | Integration / E2E | Simulates navigation flow to verify that directional guidance is accurate and updates correctly in AR view. |
| Locate points of interest | Integration | Verifies that POI markers are correctly positioned relative to user location and building coordinates. |
| Display dynamic content | Integration / Unit | Ensures dynamic data (services, images, text) updates in response to backend changes. |
| Use profiles | Unit / Integration | Tests user profile storage and retrieval, ensuring that preferences are applied consistently. |
| Save preferences | Unit | Validates the saving and loading of user preferences (language, AR mode, etc.) in local storage. |
| Manage content | Integration | Confirms admin or content manager operations update building and history data correctly. |
| Use common multimedia formats | Unit / Integration | Tests compatibility with common formats (.mp4, .jpg, .png) to ensure consistent playback and display. |
| View external resources | Integration | Validates correct loading of external links (e.g., maps, documents) inside the Unity WebView. |
| Open links in-app | E2E | Ensures external links open within the app environment without breaking user context. |
| Use app as a guest | E2E | Simulates a complete user flow without login, ensuring all guest-accessible features work correctly. |