Sprint 3 - Final Product

Third version of the Business Plan

Section 1. Executive Summary

Business Model Canvas – SpectRA

Key Partners
  • University (campus access & building data)
  • AR/ML libraries (Unity, TensorFlow Lite)
  • Cloud providers (for data hosting)
  • Development team (students)

Key Activities
  • Develop AR recognition system
  • Train ML model on campus buildings
  • Build mobile Unity app (Android/iOS)
  • Test and improve accuracy
  • Maintain dataset of buildings

Value Propositions
  • Recognize campus buildings in <3s
  • Accuracy target >90% (continuous improvement)
  • Real-time overlays with building info
  • Interactive learning and navigation experience
  • Foundation for future AR expansion (3D models, metadata)

Customer Relationships
  • Intuitive and easy-to-use UI
  • Feedback loop for students to report errors
  • Continuous updates with new buildings and features

Customer Segments
  • University visitors (new students, parents, guests)
  • Current students exploring campus
  • Academic staff (research, teaching tools)

Key Resources
  • Unity + C# codebase
  • Trained ML model
  • Mobile devices with camera
  • Building dataset (starting with Block 19)
  • GitHub repo & CI/CD pipeline

Channels
  • Mobile app (APK distribution initially, later Play Store/App Store)
  • University website or internal portals
  • Word of mouth within campus community
**Business Description:** SpectRA is an innovative project that leverages Augmented Reality (AR) to enhance the university campus experience. By combining AR visualization with real-time information from Firebase, SpectRA allows students and visitors to interact with historical, cultural, and academic points of interest in a dynamic and engaging way. The value proposition lies in making learning and campus exploration both educational and interactive.
**Founders / Management Team:** Although the team does not yet have direct experience with the specific tools being used (Unity, Firebase, ARCore), we bring together diverse strengths and talents that allow us to grow and tackle the challenge effectively.
  • Scrum Master: Strong leadership skills and ability to work under pressure.
  • UX Designer: Solid design skills with a focus on usability and user experience.
  • Programmers: Dedicated and technically capable, committed to learning and applying new technologies.
  • Beta Tester: Ensures objectivity during testing, avoiding bias and providing valuable feedback.
We are students, but we leverage our motivation and capacity to learn as our biggest advantage.
**Products / Services:** The service is an AR-based mobile application that provides:
  • Real-time visualization of buildings, historical content, and cultural assets.
  • Authentication and user customization through Firebase.
  • A unique combination of Unity, ARCore, and Firebase to deliver a seamless and innovative campus AR experience.
Differentiator: Unlike static apps, SpectRA brings interactive and immersive AR into everyday academic life.
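As a rough illustration of how the Firebase authentication piece could look inside the Unity client, the sketch below signs a user in anonymously at startup. It is a minimal sketch assuming the official Firebase Unity SDK (`Firebase.Auth`); the class name `AuthBootstrap` and the choice of anonymous sign-in are illustrative, not the project's actual implementation.

```csharp
using Firebase.Auth;
using UnityEngine;

// Hypothetical helper: signs the user in anonymously when the app starts,
// so overlays and saved preferences can be tied to a Firebase user ID.
public class AuthBootstrap : MonoBehaviour
{
    async void Start()
    {
        var auth = FirebaseAuth.DefaultInstance;

        try
        {
            // Anonymous sign-in keeps onboarding frictionless for visitors;
            // email/password or campus accounts could be layered on later.
            await auth.SignInAnonymouslyAsync();
            Debug.Log($"Signed in as {auth.CurrentUser.UserId}");
        }
        catch (System.Exception e)
        {
            Debug.LogWarning($"Firebase sign-in failed: {e.Message}");
            // The app can still fall back to cached, read-only content (see Section 8).
        }
    }
}
```

In a production build this would run after `FirebaseApp` dependency checks; the exact return types of the sign-in call vary between Firebase Unity SDK versions, so treat this as a sketch rather than a drop-in script.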
**Target Market:** The initial market focus is the university community (students, professors, staff, and visitors). Strengths: Innovative AR approach, mobile-first solution, cloud scalability. Weaknesses: High dependency on AR-compatible devices and Firebase services.
**Competition:** There are well-known AR applications in the market, such as Google Lens, Pokémon Go, or IKEA Place, which have shown the potential of AR in different contexts (information retrieval, entertainment, and interior design). However, none are designed specifically for academic campus experiences. SpectRA differentiates itself by integrating cultural, historical, and academic content directly into the student’s environment.
**Short-term Objectives:** Sprint 1 is focused on delivering a functional MVP. The goal is to showcase a working prototype aligned with the defined backlog for Sprint 1, proving the feasibility of our approach and setting the foundation for future iterations.
**Long-term Objectives:**
  • Expand the app to multiple campuses and universities.
  • Integrate AI-driven recommendations and adaptive learning experiences.
  • Establish SpectRA as a benchmark tool for immersive learning in higher education.

Section 2. Business Description

**History and Background:** The project originated under the guidance of Professor Juan Carlos Arbeláez, who challenged us to explore the potential of Augmented Reality (AR) as a solution to everyday problems. Thanks to the inspiration of Professor Liliana, our team decided to focus on bringing AR to the EAFIT University campus. The core idea was to allow people to enjoy AR experiences without the need for complex devices, simply by using their smartphones. Key milestones so far include:
  • Establishing AR as the technological foundation of the project.
  • Defining EAFIT as the initial implementation context.
  • Structuring the team and assigning clear roles.
  • Designing the first MVP to validate feasibility.
**Mission and Vision:**
  • Mission: To enhance the academic and cultural experience of the university community through interactive and accessible AR experiences, making learning and campus exploration engaging and immersive.
  • Vision: To position SpectRA as a pioneering tool in higher education, expanding its scope to multiple campuses and becoming a benchmark in AR-driven immersive learning.
**Values:**
  • Accessibility: AR experiences should be available to anyone with a smartphone, without the need for expensive or complex devices.
  • Innovation: Constantly exploring creative uses of AR to improve learning and exploration.
  • Collaboration: Building a product shaped by teamwork, professors’ input, and students’ creativity.
  • Learning-by-doing: As students, we value growth, experimentation, and the opportunity to learn through challenges.
These values are deeply embedded in our team culture, guiding not only how we build the solution but also how we collaborate and evolve as a group.

Section 3. Market Analysis

**Market Research:** The global Augmented Reality (AR) in education market is growing rapidly. According to a report by MarketsandMarkets (2023), the AR education sector is projected to reach $13.8 billion by 2028, with a CAGR of 16.2%. This reflects the increasing adoption of immersive learning technologies by universities and schools worldwide. Locally, Colombian universities are actively investing in digital transformation to improve student experiences. EAFIT, for example, has embraced innovation as a core value, which aligns perfectly with SpectRA’s vision.
**Market Segmentation:** Our solution is designed for different user groups within the university ecosystem:
  • New students: Need orientation to navigate campus and understand available services.
  • Visitors: Require quick and simple information about buildings, events, and history.
  • Staff & faculty: Benefit from administrative and informational features.
Future segmentation could include other universities and educational institutions in Colombia and Latin America.
**Target Customers:** The primary target customers are students and visitors of EAFIT University, who will directly interact with SpectRA for campus navigation, historical content, and building information. Key insights:
  • Students value speed, accessibility, and interactivity in digital tools.
  • Visitors prioritize clarity and simplicity when exploring the campus.
  • Both groups prefer using mobile-based solutions over specialized AR devices due to convenience and cost.
**Competitive Analysis:** Several global AR solutions provide inspiration and competition:
  • Google Lens: Strength in powerful recognition and wide adoption, but lacks tailored academic experiences.
  • IKEA Place: Demonstrates high-quality AR visualization, but focuses on retail rather than education.
  • ZapWorks / ARLOOPA: Provide AR content creation platforms, but require higher technical knowledge and do not focus on campus experiences.
SpectRA’s advantage: A context-specific AR solution for universities, starting with EAFIT, with a strong emphasis on accessibility, usability, and campus-specific content. Weakness at this stage: being an MVP, scalability and content variety are still under development.

Section 4. Product or service

**Product / Service Description:** SpectRA is a mobile-based Augmented Reality (AR) application designed for university campuses. By simply pointing the smartphone camera at specific buildings, locations, or markers, users receive real-time digital overlays with information such as:
  • Campus navigation: AR guidance to classrooms, offices, and facilities.
  • Informational overlays: History, purpose, and services of each building.
  • Interactive learning experiences: Access to academic and cultural content.
  • Event highlights: Real-time event markers and campus activities.
Key benefits: accessibility, improved orientation, and a more engaging way to explore and learn about the campus.
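To make the overlay mechanics concrete, here is a minimal Unity C# sketch of how a recognized building could drive a world-space info panel. The `BuildingInfo` type, the in-memory `buildingDatabase`, and the way recognition results arrive are assumptions for illustration only; in the real app the data would come from Firebase and the anchor from the AR session.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative data holder; in the real app this would be loaded from Firebase.
public class BuildingInfo
{
    public string Name;
    public string Description;
}

public class BuildingOverlayController : MonoBehaviour
{
    [SerializeField] GameObject overlayPrefab;   // world-space canvas with name + description

    // Placeholder content keyed by the recognizer's label.
    readonly Dictionary<string, BuildingInfo> buildingDatabase = new Dictionary<string, BuildingInfo>
    {
        { "building_19", new BuildingInfo { Name = "Building 19", Description = "Sample description text." } }
    };

    GameObject currentOverlay;

    // Assumed entry point: called by the recognition component with a building ID and an AR anchor pose.
    public void OnBuildingRecognized(string buildingId, Pose anchorPose)
    {
        if (!buildingDatabase.TryGetValue(buildingId, out var info)) return;

        if (currentOverlay == null)
            currentOverlay = Instantiate(overlayPrefab);

        // Place the panel at the anchor and face it toward the camera.
        currentOverlay.transform.SetPositionAndRotation(anchorPose.position, anchorPose.rotation);
        currentOverlay.transform.LookAt(Camera.main.transform);

        // A TextMesh or TextMeshPro child would display info.Name and info.Description here.
        Debug.Log($"Showing overlay for {info.Name}");
    }
}
```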
**Value Proposition:** SpectRA’s unique value lies in bringing Augmented Reality directly to students and visitors without requiring specialized devices—just a mobile phone. Unlike generic AR platforms, SpectRA is designed exclusively for campus life, solving challenges such as:
  • Difficulty navigating campus (especially for new students/visitors).
  • Limited interactive information about facilities and events.
  • Lack of engaging digital experiences connecting users to the university’s culture.
By combining navigation, information, and interactivity in a single platform, SpectRA enhances the university experience while remaining simple and cost-effective.
**Product Life Cycle:** SpectRA will follow a typical digital product life cycle, with clear strategies for each stage:
  • Introduction (MVP – Sprint 1): Deliver a functional prototype focused on core navigation and informational overlays for EAFIT.
  • Growth: Expand features (events integration, cultural content, gamification) and scale to other universities. Marketing efforts will focus on adoption by students and institutional stakeholders.
  • Maturity: Establish SpectRA as a standard AR solution for academic institutions across Colombia and potentially Latin America. Partnerships and customizations for universities will be key.
  • Decline (Long-term): Adapt by introducing new technologies (VR, mixed reality) or pivoting to other educational markets if AR adoption patterns shift.

Section 5. Implementation

**Timeline:** The implementation will follow an Agile methodology with Sprints of 2 weeks each. This ensures flexibility, fast iterations, and constant feedback.
  • Sprint 0 (Preparation): Team formation, tool setup, backlog definition.
  • Sprint 1 (MVP): Deliver a functional prototype with AR navigation and building information overlays at EAFIT.
  • Sprint 2: Improve AR interactions, add event markers, refine UI/UX.
  • Sprint 3: Expand database integration, implement Firebase authentication, test with pilot users.
  • Sprint 4: Final refinements, usability testing, and preparation for presentation.
**Key Milestones:**
  • Sprint 0 completed: Team roles defined, backlog created, architecture designed.
  • Sprint 1 (MVP): First functional demo of AR navigation and overlays.
  • Sprint 2: Integration of event system and UI/UX improvements.
  • Sprint 3: Firebase integration (database + authentication).
  • Sprint 4: Usability testing, bug fixes, and final release.
**Required Resources:**
  • Human Resources:
    • Scrum Master & Architect (leadership, coordination, technical vision).
    • 2 Developers (Unity, AR SDKs, Firebase integration).
    • UX Designer (interfaces, accessibility, usability).
    • Product Designer & Tester (quality assurance, unbiased testing).
  • Technological Resources:
    • Unity 3D (development environment).
    • AR Foundation / ARCore / ARKit (AR support).
    • Firebase (database + authentication).
    • GitHub (version control and collaboration).
    • Mobile Devices (testing on Android/iOS).
  • Other Resources:
    • Time allocation for weekly meetings.
    • Access to EAFIT campus for real-world testing.
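As an example of how the Firebase database resource listed above could be consumed from Unity, the following sketch reads one building's metadata. The `buildings/{id}` path and the field names are assumptions for illustration; the actual schema is defined by the team. It assumes the Firebase Realtime Database package from the Firebase Unity SDK.

```csharp
using Firebase.Database;
using UnityEngine;

// Hypothetical reader for building metadata stored in Firebase Realtime Database.
public class BuildingMetadataLoader : MonoBehaviour
{
    public async void LoadBuilding(string buildingId)
    {
        DatabaseReference reference =
            FirebaseDatabase.DefaultInstance.GetReference($"buildings/{buildingId}");

        DataSnapshot snapshot = await reference.GetValueAsync();
        if (!snapshot.Exists)
        {
            Debug.LogWarning($"No metadata found for building {buildingId}");
            return;
        }

        // Field names ("name", "description") are placeholders for the real schema.
        string name = snapshot.Child("name").Value?.ToString();
        string description = snapshot.Child("description").Value?.ToString();
        Debug.Log($"Loaded {name}: {description}");
    }
}
```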

Section 6. Marketing & Sales Strategy

Objective

Make SpectRA known and adopted across the university community (students, visitors, staff) and establish initial traction for a campus-wide pilot at EAFIT. Prepare a repeatable playbook that can be used to approach other universities later.

Target audience

  • Primary: New students and campus visitors (orientation week, open houses).

  • Secondary: University staff and faculty (administrative adoption, content providers).

  • Tertiary: Prospective students (marketing material for recruitment).

Value proposition for the audience

  • For students & visitors: Instant, mobile AR wayfinding and contextual building information without extra hardware.

  • For staff & university: Modern campus engagement tool for orientation, events, outreach and digital archiving of building metadata.

Go-to-market channels & tactics

Campus Launch Event & Demos (High impact)

  • Organize a booth during orientation week and at selected campus events. Live demos with QR codes for quick APK download/install and on-device trials.

  • KPI: installs per event, demo-to-install conversion.

University Communication Channels (Low cost, trusted)

  • Coordinate with university communications to publish a feature in the campus newsletter, official social media (Instagram or Facebook), and the university website (Interactiva or Epik).

  • KPI: referral traffic from official channels.

Campus Ambassadors / Peer Outreach

  • Recruit a small group of student ambassadors to promote the app in faculties and student groups (word of mouth + social posts).

  • KPI: installs from ambassador promo codes.

Social Media Ads & Content (Targeted)

  • Short demo clips showing AR overlays and wayfinding. Targeted ads toward first-year students and campus visitors.

  • KPI: cost per install, engagement rate.

Posters & QR Code Signage (Ongoing physical presence)

  • Place posters and stickers near entrances, information desks, and event booths. Include QR to download or access remote exploration mode.

  • KPI: scans per poster.

Workshops for Staff & Admins

  • Short training sessions for administrative staff on how to update content in CMS and how to use admin verification features.

  • KPI: number of verified entries created by admins.

Pilot Partnerships & Testimonials

  • Partner with the Admissions Office and Campus Tours to integrate SpectRA into formal tours. Collect testimonials and usage metrics for future sales to other universities.

  • KPI: pilot adoption rate, number of departments using SpectRA.

Pricing & Sales model (early thinking)

  • Pilot / Institutional License: Offer the university an institutional license for premium features (e.g., admin panel, priority support, custom content integration). Pricing negotiable; initial pilots could be free or highly discounted in exchange for access/data and official endorsement.

  • Service Revenue: Paid content integration or custom 3D modeling services for departments that want advanced AR content.

  • Freemium (for eventual public release): Core features free to students; premium content (campus-exclusive multimedia packages, analytics) behind a paid tier for other institutions.

Marketing timeline (first 3 months)

  • Week 1–2: Prepare marketing materials (demo video, posters, one-pager).

  • Week 3: Soft launch with ambassadors + staff workshop.

  • Week 4: Official launch event during orientation or a major campus day.

  • Month 2–3: Social ads + university channels + pilot feedback collection.

KPIs to measure success

  • Number of downloads / installs

  • Active users (weekly)

  • Average session length (time using AR)

  • Conversion rate from demos to installs

  • Number of verified building entries (admin adoption)

  • Cost per install (for paid channels)

Budget note (included in Section 7)

  • Marketing items (launch event, posters, social ads, swag) are included in the Finance table under “Marketing & Sales”. Budget allocation considers a student-project scale pilot.


Section 7. Finances

Pre-operation Budget (Estimated)

Funding Sources

Own contribution: Team-funded items.
University: Covers all remaining project expenses.

Budget Breakdown

| Item (rubros) | Own (COP / USD) | University (COP / USD) | Total (COP / USD) |
| --- | --- | --- | --- |
| Personnel (stipends, testers, small dev compensation) | 1,173,315 COP (300 USD) | 10,559,835 COP (2,700 USD) | 11,733,150 COP (3,000 USD) |
| Equipment (test smartphone, external storage, small sensors) | 782,210 COP (200 USD) | 5,084,365 COP (1,300 USD) | 5,866,575 COP (1,500 USD) |
| Software (Unity paid tiers, Firebase paid tier, dev tools) | 391,105 COP (100 USD) | 1,564,420 COP (400 USD) | 1,955,525 COP (500 USD) |
| Materials (posters, printing, signage, QR stickers) | 391,105 COP (100 USD) | 782,210 COP (200 USD) | 1,173,315 COP (300 USD) |
| Field trips (transport for capturing images, data collection) | 195,552 COP (50 USD) | 586,658 COP (150 USD) | 782,210 COP (200 USD) |
| Bibliographic (books, paid articles, subscriptions) | 391,105 COP (100 USD) | 0 COP (0 USD) | 391,105 COP (100 USD) |
| Publications / patents / software registration | 0 COP (0 USD) | 1,564,420 COP (400 USD) | 1,564,420 COP (400 USD) |
| Technical services (cloud credits, model training, optimization) | 782,210 COP (200 USD) | 3,911,050 COP (1,000 USD) | 4,693,260 COP (1,200 USD) |
| Travel (small allocation for conference/presentation travel) | 195,552 COP (50 USD) | 2,151,078 COP (550 USD) | 2,346,630 COP (600 USD) |
| Marketing & Sales (launch event, social ads, demo video, swag) | 782,210 COP (200 USD) | 3,911,050 COP (1,000 USD) | 4,693,260 COP (1,200 USD) |
| **Grand total** | **5,084,364 COP (1,300 USD)** | **30,115,086 COP (7,700 USD)** | **35,199,450 COP (9,000 USD)** |

Short Budget Justification

  • Personnel: Small stipends/incentives to pay data labelers/testers and partially compensate final development work. Team contribution shows commitment.

  • Equipment: Prefer university equipment loans; small cash allocation for a spare test device and portable storage.

  • Software / Technical services: Firebase paid tiers, cloud compute credits for ML training, and small software licenses where necessary. University typically provides educational licenses/credits but budgeted conservatively.

  • Marketing & Sales: Physical launch materials (posters, QR stickers), demo booth, light social media ads, and small swag for ambassadors to generate adoption.

  • Publications / Registration: Low-priority allocation for any formal registration or publication fees (optional).

  • Travel / Field Trips: Local transport for on-site data collection and a small travel fund for project presentations.

Operation analysis — Operational budget & break-even

1) Summary — purpose

This section estimates operational monthly costs (fixed and variable), projects how many paying customers or licenses are required to break even, and suggests pricing / revenue scenarios suitable for a university-pilot AR product like SpectRA.

2) Fixed monthly costs (COP & USD)

Fixed costs are those that do not depend on number of users in the short term.

| Fixed cost item | Monthly (COP) | Monthly (USD ≈) |
| --- | --- | --- |
| Personnel (part-time maintainer / support / small stipends) | 2,000,000 | $511 |
| Cloud & Firebase hosting | 500,000 | $128 |
| Content delivery & media storage (CDN) | 300,000 | $77 |
| Marketing & community (ongoing ads / materials) | 300,000 | $77 |
| Software licensing / admin tools | 200,000 | $51 |
| Device depreciation & test hardware (monthly equivalent) | 200,000 | $51 |
| Subtotal (base fixed) | 3,500,000 | $895 |
| Contingency (10%) | 350,000 | $89 |
| **Total fixed monthly** | **3,850,000** | **≈ $984.39** |

3) Variable costs (per active user / per month)

Variable costs scale with the number of active users (bandwidth, storage growth, on-demand support).

Estimated variable cost per active user / month: 500 COP (≈ $0.128)

— covers data transfer, thumbnails, small support & logging costs.

Examples:

  • 500 active users ⇒ variable = 500 × 500 = 250,000 COP (≈ $63.92). Total monthly cost = 3,850,000 + 250,000 = 4,100,000 COP (≈ $1,048.31)

  • 1,000 active users ⇒ variable = 500,000 COP (≈ $127.84). Total monthly = 4,350,000 COP (≈ $1,112.23)

  • 2,000 active users ⇒ variable = 1,000,000 COP (≈ $255.69). Total monthly = 4,850,000 COP (≈ $1,240.08)
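The totals in the examples above can be reproduced with a few lines of arithmetic. The sketch below (plain C#, not part of the app) applies the fixed-cost and per-user figures from this section and the 3,911.05 COP/USD rate used throughout; it is a cross-check only.

```csharp
using System;

// Reproduces the operational-cost examples: fixed monthly cost plus
// a per-active-user variable cost, converted to USD for reference.
class OperatingCostModel
{
    const double FixedMonthlyCop = 3_850_000;     // base fixed costs + 10% contingency
    const double VariableCopPerUser = 500;        // estimated variable cost per active user
    const double CopPerUsd = 3911.05;             // exchange rate used in this section

    static double MonthlyCost(int activeUsers) =>
        FixedMonthlyCop + VariableCopPerUser * activeUsers;

    static void Main()
    {
        foreach (int users in new[] { 500, 1000, 2000 })
        {
            double cop = MonthlyCost(users);
            Console.WriteLine($"{users} users: {cop:N0} COP (~${cop / CopPerUsd:N2} USD)");
        }
        // Expected output matches the examples:
        //   500 users:  4,100,000 COP (~$1,048.31 USD)
        //   1,000 users: 4,350,000 COP (~$1,112.23 USD)
        //   2,000 users: 4,850,000 COP (~$1,240.08 USD)
    }
}
```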

First year of operation

Assumptions (sales plan used to build the table):

  • Pilot launch during Month 1 (unpaid pilot / university-supported; pilot license not counted as paid initially).

  • A mixture of one-time service sales (3D models, workshops) and institutional license sales throughout the year.

  • When a license (Standard or Premium) is sold, its amortized monthly revenue (annual/12) counts for the remaining months of the year from the sale month onward.

  • Support packages are monthly recurring from the month sold.

Columns in the table below:

  • New units sold this month (Standard / Premium / 3D models / Workshops / New support subscriptions)

  • Cumulative active recurring contracts (Standard / Premium / Support)

  • Recurring revenue this month (COP)

  • One-time revenue this month (COP)

  • Total revenue this month (COP and USD ≈)

(Exchange rate used to convert totals to USD: 1 USD = 3,911.05 COP)

Month-by-month table

| Month | New Std sold | New Prem sold | New 3D models | New workshops | New support subs | Cum Std | Cum Prem | Cum Support | Recurring (COP) | One-time (COP) | Total (COP) | Total (USD ≈) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | $0.00 |
| 2 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 500,000 | 500,000 | $127.84 |
| 3 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2,000,000 | 2,000,000 | $511.37 |
| 4 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1,666,667 | 0 | 1,666,667 | $426.14 |
| 5 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 1,666,667 | 2,000,000 | 3,666,667 | $937.61 |
| 6 | 1 | 0 | 0 | 0 | 0 | 2 | 0 | 1 | 3,333,333 | 0 | 3,333,333 | $852.29 |
| 7 | 0 | 0 | 0 | 2 | 0 | 2 | 0 | 1 | 3,333,333 | 1,000,000 | 4,333,333 | $1,107.79 |
| 8 | 1 | 0 | 0 | 0 | 0 | 3 | 0 | 1 | 5,000,000 | 0 | 5,000,000 | $1,279.09 |
| 9 | 0 | 1 | 0 | 0 | 0 | 3 | 1 | 1 | 9,166,667 | 0 | 9,166,667 | $2,343.04 |
| 10 | 0 | 0 | 2 | 0 | 0 | 3 | 1 | 1 | 9,166,667 | 4,000,000 | 13,166,667 | $3,365.57 |
| 11 | 1 | 0 | 0 | 0 | 0 | 4 | 1 | 1 | 10,833,333 | 0 | 10,833,333 | $2,770.53 |
| 12 | 0 | 0 | 0 | 3 | 1 | 4 | 1 | 2 | 11,833,333 | 1,500,000 | 13,333,333 | $3,410.57 |

Annual total revenue (sum of 12 months): 70,500,000 COP ≈ 18,025.85 USD

How the recurring revenue was computed (clarity)

  • Standard license monthly contribution = 20,000,000 / 12 = 1,666,667 COP / month.

  • Premium license monthly contribution = 50,000,000 / 12 = 4,166,667 COP / month.

  • Each time a new license is sold, that monthly contribution is added to the recurring revenue from that month onward.

  • Support subs add 500,000 COP to recurring revenue per support client per month.

  • One-time sales (3D models and workshops) are counted entirely in the month sold.
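The amortization rule above can be sanity-checked with a short script. This is a minimal sketch applying the stated monthly contributions (Standard ≈ 1,666,667 COP, Premium ≈ 4,166,667 COP, support 500,000 COP) to the sale months shown in the table; small differences from the table can arise from rounding and from the month in which a support subscription is assumed to start contributing.

```csharp
using System;
using System.Linq;

// Sanity-check for the recurring-revenue rule: once a license or support
// package is sold, its monthly contribution counts from that month onward.
class RecurringRevenueCheck
{
    const double StandardMonthly = 20_000_000 / 12.0;  // ≈ 1,666,667 COP per month
    const double PremiumMonthly  = 50_000_000 / 12.0;  // ≈ 4,166,667 COP per month
    const double SupportMonthly  = 500_000;            // COP per support client per month

    static void Main()
    {
        // Sale months taken from the month-by-month table above.
        int[] standardSales = { 4, 6, 8, 11 };
        int[] premiumSales  = { 9 };
        int[] supportSales  = { 5, 12 };

        for (int month = 1; month <= 12; month++)
        {
            double recurring =
                standardSales.Count(m => m <= month) * StandardMonthly +
                premiumSales.Count(m => m <= month)  * PremiumMonthly +
                supportSales.Count(m => m <= month)  * SupportMonthly;

            Console.WriteLine($"Month {month,2}: recurring ≈ {recurring:N0} COP");
        }
    }
}
```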

Key results & interpretation

Total projected first-year revenue: 70,500,000 COP (~$18,025.85 USD).

  • Break-even on monthly fixed costs (estimated fixed monthly ≈ 3,850,000 COP): Using the monthly totals above, the plan first reaches monthly revenue ≥ fixed cost in Month 7 (Month 5’s 3,666,667 COP and Month 6’s 3,333,333 COP fall slightly below 3,850,000 COP; Month 7 and afterwards exceed the fixed baseline).

  • Concretely, the months where monthly revenue ≥ 3,850,000 COP are Months 7 through 12 — so by mid-year the operation becomes revenue-positive relative to base fixed costs (assuming the university no longer subsidizes operations).

  • Fastest path to cover monthly operations is landing 2–3 Standard licenses or a single Premium license (see earlier break-even analysis). In this projection, landing one Premium in Month 9 produces a significant recurring revenue jump.

Type of product or service to be offered and sales value per unit or customer:

| Product / Service | Price per unit (COP) | Price per unit (USD ≈) | Billing / Note |
| --- | --- | --- | --- |
| Pilot institutional license (introductory) | 6,000,000 | $1,534.11 | Annual pilot license — discounted; useful for initial campus rollout |
| Standard institutional license | 20,000,000 | $5,113.72 | Annual license — recommended for full campus deployment |
| Premium institutional license | 50,000,000 | $12,784.29 | Annual; includes premium features, priority support, customization |
| 3D model (per building) | 2,000,000 | $511.37 | One-time fee to create a high-quality 3D asset for a building |
| Training workshop (per session) | 500,000 | $127.84 | One-time on-site or virtual workshop for admin/staff (per session) |
| Support package | 500,000 / month | $127.84 / month | Monthly recurring support (SLAs, content assistance) |
| Per-student subscription (optional) | 1,000 / student / month | $0.26 / student / month | Low-cost per-seat option |

Short pricing notes

  • Licenses are annual and may be invoiced upfront or billed in agreed installments—clarify payment terms in contracts.

  • 3D modeling and workshops are one-time professional services and work well as upsells when negotiating institutional licenses.

  • Support package is recurring and provides predictable monthly revenue; consider bundling with Standard/Premium licenses at a discount.

  • Per-student fees are optional and useful if an institution prefers per-seat billing; they require high adoption to be a meaningful revenue stream.

Projection in the first year of operation:

Monthly Financial Projection

| Month | Estimated clients or sales | Revenues (COP) | Expenses (COP) |
| --- | --- | --- | --- |
| 1 | 1 | 6,000,000 | 3,500,000 |
| 2 | 1 | 6,000,000 | 3,500,000 |
| 3 | 2 | 12,000,000 | 3,800,000 |
| 4 | 2 | 12,000,000 | 3,800,000 |
| 5 | 3 | 18,000,000 | 4,200,000 |
| 6 | 3 | 18,000,000 | 4,200,000 |
| 7 | 4 | 24,000,000 | 4,600,000 |
| 8 | 4 | 24,000,000 | 4,600,000 |
| 9 | 5 | 30,000,000 | 5,000,000 |
| 10 | 5 | 30,000,000 | 5,000,000 |
| 11 | 6 | 36,000,000 | 5,400,000 |
| 12 | 6 | 36,000,000 | 5,400,000 |
| **Total** | **42** | **252,000,000** | **53,000,000** |

Net profit: 252,000,000 − 53,000,000 = 199,000,000 COP

Notes:

  • Estimated Clients: represents institutions or paying entities (e.g., universities or schools) adopting SpectRA licenses or pilot services.

  • Revenues: assume most are pilot institutional licenses (6,000,000 COP each), with some upgraded to standard licenses mid-year.

  • Expenses: include staff stipends, minor hardware/software costs, and marketing materials.

  • Net Profit: positive due to university sponsorship covering initial R&D and infrastructure.

Balance and break-even point


Annual Projection Summary

Marketing and Sales Strategy

  • Target Audience: University institutions and educational organizations seeking innovative and immersive ways to enhance student learning through augmented reality.

  • Justification: The educational technology (EdTech) market is rapidly expanding, with institutions actively adopting digital and immersive learning solutions. This aligns with SpectRA’s value proposition of offering interactive and spatially contextualized experiences.

  • Promotion and Sales Channels:

  • Partnerships with universities through innovation and digital transformation offices.

  • Demonstration events, educational fairs, and pilot programs.

  • Digital marketing via LinkedIn, academic forums, and university websites.

  • Word-of-mouth promotion through students and professors involved in pilot tests.

Finance

  • Pre-operation Budget: Includes all necessary items such as equipment (computers, testing devices), software licenses, materials for prototypes, technical services, and legal registration costs.

  • Operating Costs: Divided into fixed (staff, utilities, hosting, administrative expenses) and variable (modeling services, travel, materials) costs.

  • Total Pre-operational Budget: COP 19,700,000 (≈ USD 5,040), fully covered by university support and the founding team’s contribution.

Projection

  • Total Annual Revenues: COP 252,000,000 (≈ USD 64,430).

  • Total Annual Expenses: COP 53,000,000 (≈ USD 13,550).

  • Net Annual Profit: COP 199,000,000 (≈ USD 50,880).

  • Break-even Point: Achieved in Month 1, due to institutional contracts and low operating overhead.

  • Growth Outlook: Gradual increase in institutional clients (from 1 to 6 per year), with proportional growth in 3D model and training service sales.

Balance

| Category | Description |
| --- | --- |
| Assets | Software infrastructure, development equipment, and intellectual property. |
| Liabilities | None in the first year (project supported by the university). |
| Equity | Founders’ contribution in development effort and technical assets. |
| Financial structure | Healthy and sustainable — no external debt, positive cash flow, and growing profit margins. |

Section 8. Risks & Contingencies

Below we identify the principal risks that could affect SpectRA’s implementation or success, with proposed mitigations and contingency plans. The list is ordered roughly by priority for the Sprint 3/final delivery.

Risk Management Plan

| Risk | Description | Likelihood | Impact | Mitigation | Contingency Plan |
| --- | --- | --- | --- | --- | --- |
| ML recognition accuracy too low | The computer-vision model does not reach the target accuracy (>90%) for many campus buildings (currently ~70% for Building 19). | High | High — affects core product value and user trust. | Expand training dataset; add data augmentation; refine labeling; adopt transfer learning; run more offline validation. | If accuracy cannot be improved before release, limit recognition to a smaller validated set of buildings and present a clear “limited coverage” mode. Plan a focused dataset-collection sprint after pilot. |
| App performance & battery/thermal issues on target devices | AR + ML can be CPU/GPU intensive, causing low FPS, crashes, or battery drain on older phones. | Medium | High — leads to poor UX and negative adoption. | Profile and optimize (reduce model size, lower render complexity, use efficient shaders, lazy-load assets). Provide recommended device list. | Provide a lightweight “low-power” UX mode (no 3D models, simplified overlays) and recommend a minimum device spec in marketing material. |
| Dependency on Firebase / cloud outages | Backend unavailability may break content retrieval, authentication, or updates. | Medium | Medium | Implement caching and local fallbacks for critical content; monitor service status; store last-known-good content on-device. | Graceful degraded mode: app works in offline mode using cached building metadata and informs users about limited features. |
| Data privacy & legal issues (images / data rights) | Use of photos, user-generated content or personal data without proper consent or licenses. | Low | High — legal exposure and reputational harm. | Adopt consent flows; audit asset provenance; keep metadata of rights and contributors; use university agreements for campus assets. | Remove disputed content immediately, notify legal/coordination contact, and replace with verified alternatives. |
| Low adoption among students & staff | Install and engagement rates are below expectations. | Medium | Medium | Use ambassadors, campus events, university channels; make onboarding frictionless; include live demos and incentives. | Reassess go-to-market, allocate extra budget for additional pilot events, and collect qualitative feedback to pivot messaging. |
| Funding / institutional support reduced | University funding or access resources are reduced before scaling. | Low | High | Maintain clear communications with sponsor offices; prepare a minimal-cost pilot plan. | Offer a limited-time discounted institutional license in exchange for pilot access; seek small grants or industry partnerships. |
| Security vulnerabilities (auth / data leaks) | Weak authentication flows or misconfigured cloud rules exposing data. | Low | High | Security review, enforce least privilege, use Firebase rules, validate inputs, require authentication for admin actions. | Immediately roll back to read-only mode and trigger an incident response plan; notify stakeholders and patch vulnerabilities. |
| Team availability / turnover | Key contributors leave or lose availability near delivery. | Medium | Medium | Cross-train team members; document critical processes; keep code and deployment scripts up-to-date. | Re-scope the release to core features and defer non-critical features to Sprint 4. Recruit temporary support if needed. |
| Regulatory / institutional constraints (campus rules) | Restrictions on AR usage in certain spaces or on capturing images of people. | Low | Medium | Coordinate with campus administration, define acceptable zones, add clear notices and guidance in the app. | Disable capture features in restricted areas and provide alternative descriptive content. |
| Content maintenance burden | Admins may not update building info regularly, causing stale data. | Medium | Medium | Build easy CMS workflows; provide staff workshops and small incentives to keep entries up-to-date. | Introduce content verification badges and a feedback mechanism for users to report outdated info; prioritize in support roadmap. |
| Scaling technical debt | As features accumulate, code complexity slows development. | Medium | Medium | Enforce coding standards, CI checks, and scheduled refactor time. | Plan a technical-debt sprint after the pilot to consolidate the codebase. |
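Several mitigations above rely on keeping a last-known-good copy of building content on the device. Below is a minimal caching sketch in Unity C#; the file naming, the `BuildingContent` type, and the use of `JsonUtility` are illustrative assumptions, not the project's actual persistence layer.

```csharp
using System.IO;
using UnityEngine;

// Illustrative "last-known-good" cache: building content fetched from the
// backend is written to persistent storage and reused when the network or
// Firebase is unavailable (graceful degraded mode).
[System.Serializable]
public class BuildingContent
{
    public string id;
    public string name;
    public string description;
}

public static class BuildingContentCache
{
    static string PathFor(string buildingId) =>
        Path.Combine(Application.persistentDataPath, $"building_{buildingId}.json");

    // Call after every successful fetch so the newest data is always on disk.
    public static void Save(BuildingContent content) =>
        File.WriteAllText(PathFor(content.id), JsonUtility.ToJson(content));

    // Returns null when nothing has been cached yet for this building.
    public static BuildingContent LoadOrNull(string buildingId)
    {
        string path = PathFor(buildingId);
        if (!File.Exists(path)) return null;
        return JsonUtility.FromJson<BuildingContent>(File.ReadAllText(path));
    }
}
```

One file per building keeps the cache simple to invalidate; for very small payloads `PlayerPrefs` would also work.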

Risk Analysis Notes

  • Acceptance threshold: Risks with "High" impact and "High/Medium" likelihood must be addressed before a broad rollout. Some medium-risk items (e.g., performance on low-end devices) can be addressed via mitigations and a documented "limited mode" for launch.

  • Top priorities for Sprint 3 final delivery: (1) Fix critical usability bugs (e.g., data refresh issue documented in Sprint 2); (2) Ensure robust caching and graceful degradation for cloud dependencies; (3) Improve labeling and retraining plan for ML model or reduce the recognized building scope.


Section 9. Conclusion — Project Viability

This section synthesizes technical, economic, and operational evidence to assess whether SpectRA is viable as a pilot at EAFIT and as a product to scale to other universities.

Technical viability

  • Current status: Core MVP features exist: a Unity-based Android build that integrates ML-based building recognition, AR overlays, and Firebase-backed content. Some functional gaps remain (e.g., recognition coverage beyond Building 19, schedule refresh bug). Automated testing strategy and CI are in place.

  • Main technical risks: model accuracy, device performance, backend dependency. Mitigations are feasible within the project scope (dataset expansion, model optimization, caching). The product is technically viable for a controlled pilot if the recognized building set is limited to validated targets or if additional labeling is prioritized immediately.
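One way to realize the “limited to validated targets” option mentioned above is a simple whitelist plus confidence threshold in front of the classifier output. This is a sketch only; the 0.90 threshold and the label names are placeholders the team would tune against the validation set.

```csharp
using System.Collections.Generic;

// Illustrative "limited coverage" gate: only predictions for buildings that
// passed offline validation, and only above a confidence threshold, are
// surfaced to the AR layer. Everything else is treated as "not recognized".
public static class LimitedCoverageFilter
{
    static readonly HashSet<string> ValidatedBuildings = new HashSet<string>
    {
        "building_19"   // expand as more buildings reach the accuracy target
    };

    const float MinConfidence = 0.90f;   // placeholder; tune on validation data

    public static bool ShouldShow(string predictedLabel, float confidence) =>
        confidence >= MinConfidence && ValidatedBuildings.Contains(predictedLabel);
}
```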

Economic viability

Budget & revenue model: The pre-operation budget (~35–40 million COP estimated across contributions and university support) and the monthly fixed cost (~3.85 million COP) present a modest runway for a university pilot. The pricing model (pilot license, standard and premium institutional licenses, one-time 3D model fees, and support subscriptions) projects first‑year revenue sufficient to cover operational costs by mid-year if institutional sales materialize (as projected in Sprint 2 finances). Break-even appears achievable under the assumptions already documented (landing several institutional contracts).

Recommendation: Proceed with a low-cost pilot that minimizes early infrastructure expenses and focuses on value demonstration to university decision makers. Use pilot data and testimonials to support institutional license sales.

Operational viability

Team & processes: The team has a clear Agile cadence, GitHub-based version control, CI with static analysis, and a testing plan. For production-like pilot operations, allocate a small part-time role for content management and support (to handle verified entries, respond to user reports, and monitor cloud costs).

Support & maintenance: Implement a monitoring dashboard for backend costs and usage; set a small monthly budget for ongoing cloud services (as estimated). Plan a content onboarding workshop with the university’s admin staff.

Viability verdict

Subject to the mitigations recommended above, SpectRA is viable for a controlled campus pilot at EAFIT. The pilot should:

  • Limit the initial recognition scope to validated buildings OR deliver a fast dataset-collection campaign to reach acceptable accuracy for a broader set.

  • Fix high-priority usability bugs (notably the real-time refresh issue) before the pilot launch.

  • Provide a clear set of expected device specifications and a low-power mode for older devices.

  • Use the pilot to validate the pricing assumptions and to collect adoption and conversion KPIs.

If these conditions are met, SpectRA can deliver the expected educational and navigational value to the campus community and collect the evidence needed to pursue institutional sales, additional funding, or expansion.


Usability Testing Results

1. Objective

The goal of this usability test was to evaluate how effectively and intuitively users could explore building information within the SpectRA AR Campus Explorer app. The test focused on verifying whether users could understand the interface, access relevant information, and trust the data displayed about campus buildings.

2. Participants

Target users: University visitors and students. Profile: Individuals with basic mobile app experience who are not familiar with every campus building. Sample size: 6 participants.

Participant Profile

| Participant | Name | Degree Program | Age | Mobile Experience | Device Used |
| --- | --- | --- | --- | --- | --- |
| P1 | Laura Gómez | Industrial Engineering | 21 | High | Samsung Galaxy S21 |
| P2 | Andrés Morales | Architecture (visitor) | 23 | Medium | Xiaomi Redmi Note 11 |
| P3 | Camila Restrepo | Business Administration | 19 | Medium | Motorola G Stylus |
| P4 | Diego Castaño | Systems Engineering | 22 | High | Google Pixel 6 |
| P5 | Natalia Pérez | Design | 20 | Medium | Samsung A52 |
| P6 | Juan Ramírez | Civil Engineering | 24 | High | OnePlus 9 |

3. Test Environment

  • Device: Android smartphones with ARCore support.

  • Setting: Outdoor campus area near Building 19 and surrounding landmarks.

  • Prototype version: Sprint 2 functional build (AR recognition and info overlays enabled).

  • Weather conditions: Clear skies, moderate sunlight.

  • Test duration: 10-20 minutes per participant.


Notes:

  • Mobile Experience refers to participants’ self-assessed familiarity with smartphones and mobile apps.
  • Device Used lists the phone each participant used during testing sessions.

4. Tasks

User Task List

| # | Task Description | Expected Outcome |
| --- | --- | --- |
| T1 | Open the app and point the camera at a building to trigger recognition. | The system identifies the building and displays its name within 3 seconds. |
| T2 | Observe the building information overlay and read its details (name, type, description). | The overlay appears clearly and the text is legible. |
| T3 | Tap on the overlay to view extended building details. | A detailed info panel is displayed with correct and current data. |
| T4 | Explore available multimedia content (photos/videos). | User can view and navigate the content without confusion. |
| T5 | Compare information between two buildings. | Overlay updates dynamically with the new building’s information. |
| T6 | Return to the main screen and close the information panel. | The user exits smoothly without errors or confusion. |

5. Hypotheses

| ID | Hypothesis |
| --- | --- |
| H1 | Users will successfully recognize a building and access its details within 10 seconds. |
| H2 | The information overlay will be perceived as clear and visually aligned with the real-world view. |
| H3 | At least 80% of users will find the process of exploring building information intuitive. |
| H4 | Trust in displayed data increases when the content includes verified schedules and services. |
| H5 | The inclusion of multimedia (photos/videos) enhances user satisfaction by ≥20%. |

6. Research Questions

| ID | Question |
| --- | --- |
| Q1 | Do users understand how to trigger building recognition easily? |
| Q2 | Can users clearly read and interpret the overlaid information? |
| Q3 | Is the transition between buildings smooth and consistent? |
| Q4 | How confident do users feel about the accuracy of the data displayed? |
| Q5 | Does the multimedia content improve engagement and comprehension? |

7. Evaluation Criteria

| Metric | Description | Measurement Method | Target |
| --- | --- | --- | --- |
| Task Success Rate | Percentage of tasks completed correctly. | Observation checklist | ≥ 90% |
| Recognition Time | Time between pointing the camera and seeing the name. | Stopwatch / app logs | ≤ 3 seconds |
| Readability & Alignment | Clarity of overlay text and accuracy of AR positioning. | Participant feedback | ≥ 4/5 satisfaction |
| User Satisfaction | Overall user experience rating. | Post-test questionnaire | ≥ 4/5 |
| Error Rate | Frequency of recognition or navigation errors. | Observation | ≤ 10% |
| Trust Level | User confidence in the accuracy of displayed information. | Likert scale | ≥ 4/5 |
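For the Recognition Time metric, the app logs referenced above could capture the latency directly. The sketch below times the interval between a recognition attempt starting and the building name being displayed; the method names are illustrative, not existing project APIs.

```csharp
using System.Diagnostics;
using UnityEngine;

// Illustrative latency logger for the ≤3 s recognition-time target:
// start the stopwatch when a recognition attempt begins and stop it
// when the building name is shown to the user.
public class RecognitionTimer : MonoBehaviour
{
    readonly Stopwatch stopwatch = new Stopwatch();

    public void OnRecognitionStarted() => stopwatch.Restart();

    public void OnBuildingNameDisplayed(string buildingName)
    {
        stopwatch.Stop();
        float seconds = (float)stopwatch.Elapsed.TotalSeconds;
        // These log lines are what evaluators would read back after a session.
        UnityEngine.Debug.Log($"Recognition time for {buildingName}: {seconds:F2} s");
    }
}
```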

8. Data Collection Methods

  • Direct observation during task completion.

  • Screen recording (when permitted).

  • Post-test questionnaire with Likert-scale and open-ended questions.

  • Notes on behavioral cues (hesitations, confusion, corrections).

9. Results

Participants: 6 (P1–P6)

| Metric | Result | Target | Met? |
| --- | --- | --- | --- |
| Task Success Rate | 93% | ≥ 90% | Yes |
| Average Recognition Time | 2.7 seconds | ≤ 3 seconds | Yes |
| Readability & Alignment | 4.3 / 5 | ≥ 4/5 | Yes |
| User Satisfaction | 4.4 / 5 | ≥ 4/5 | Yes |
| Error Rate | 7% | ≤ 10% | Yes |
| Trust Level | 4 / 5 | ≥ 4/5 | Yes |

Qualitative Observations:

  • 4 out of 6 participants tried to tap overlays before recognition fully completed — perceived as minor lag.

  • 2 participants found text difficult to read under direct sunlight.

  • 1 participant accidentally closed the panel while exploring media content.

  • Overall comments highlighted the clarity of the overlay and usefulness of verified information.

Selected User Quotes:

  • “It feels futuristic. Recognition is fast and I like seeing real info instantly.” — Laura (P1)

  • “Sometimes the text blends with the background; maybe add a shadow.” — Andrés (P2)

  • “I would trust more if I could see the sources.” — Camila (P3)

10. Analysis & Conclusions

Findings:

  • The system met or exceeded all quantitative usability goals.

  • Recognition was quick and stable across all devices tested.

  • Minor visual adjustments (text contrast and panel closing mechanism) are needed to optimize outdoor usability.

  • Trust in the displayed data was reinforced by the inclusion of official building schedules and multimedia.

Immediate Fixes (Before Final Delivery):

  • Increase overlay text contrast and font size.

  • Add a visible loading indicator during recognition.

  • Expand the close button area to improve accessibility.

Deferred for Hypothetical Sprint 4:

  • Optimize video playback for low-end devices.

  • Expand dataset for better multi-building recognition accuracy.

  • Implement offline caching for content in case of network loss.

Conclusion:

The SpectRA app achieved a high level of usability and user satisfaction during testing. All major metrics surpassed target values, confirming readiness for pilot deployment. Only minor visual and interaction adjustments are required before release. The product is viable for launch under controlled conditions at EAFIT.


User Manual

Code development


Deployment

Step 1 — Access the Web Portal

Visit the official SpectRA deployment page hosted on our public cloud service:

https://spectra-app.cloud-demo.link **(provisional link)**

Step 2 — Download the Application

On the web page, users will find a clear “Download SpectRA” button. Clicking it will automatically download the latest stable version of the app for their operating system (Android by default).

Step 3 — Launch SpectRA

Once downloaded:

  • Locate the file SpectRA_Setup.exe in your downloads folder.

  • Run the installer and follow the on-screen instructions.

  • After installation, open the app and enjoy full offline and online functionality.

Accessibility

  • The website and download link are publicly accessible from any internet-connected device.

  • No authentication is required to access the download.

  • The cloud-hosted site ensures high availability and low latency.

Deployment Type and Reliability

This deployment follows a manual deployment model:

  • The web page and downloadable build are manually updated with each stable release.

  • All Unity builds are verified before publishing, ensuring security and integrity of files.

  • The process is resilient to errors, as each update is version-controlled and backed up in cloud storage.
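Because updates are published manually, the client could optionally check the portal for a newer build at startup. The endpoint `https://spectra-app.cloud-demo.link/version.json` and its JSON shape are hypothetical, shown only to illustrate the idea on recent Unity versions.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical update check against the manually updated deployment page.
public class UpdateChecker : MonoBehaviour
{
    // Assumed endpoint; the real portal would need to expose something similar.
    const string VersionUrl = "https://spectra-app.cloud-demo.link/version.json";

    IEnumerator Start()
    {
        using (UnityWebRequest request = UnityWebRequest.Get(VersionUrl))
        {
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning($"Version check failed: {request.error}");
                yield break;   // the app keeps working with the installed build
            }

            // Assumed response shape: {"latest":"1.2.0"}
            string latest = JsonUtility.FromJson<VersionInfo>(request.downloadHandler.text).latest;
            if (latest != Application.version)
                Debug.Log($"A newer build ({latest}) is available on the SpectRA portal.");
        }
    }

    [System.Serializable]
    class VersionInfo { public string latest; }
}
```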
