06. Testing Plan Report - khalillabban/Snorting-Code GitHub Wiki
For testing, Jest, SonarQube, Codecov, and Maestro are used. Two separate automated workflows are relevant: the build workflow and the continuous integration (CI) workflow.
Unit testing is conducted in both workflows using Jest. Each member writes unit tests for their own additions, ensuring the tests are specifically targeted to confirm correct behaviour. Unit tests run during both continuous integration and the actual build process. The main units are the user interface (which encompasses both display of and interaction with the map), the integration with Google Calendar, and the individual routing units, which cover routing from an external location to a campus building and routing through the inside of a campus building.
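As a concrete illustration of the kind of unit under test, here is a hedged sketch of a ray-casting point-in-polygon check. The name mirrors `utils/pointInPolygon.ts` from the coverage report below, but the implementation and the sample Jest assertion are illustrative only, not the project's actual code.

```typescript
// Illustrative sketch of a ray-casting point-in-polygon check,
// similar in spirit to utils/pointInPolygon.ts (names are assumptions).
type Point = { lat: number; lng: number };

function pointInPolygon(p: Point, polygon: Point[]): boolean {
  let inside = false;
  // Walk each edge (j -> i) and toggle `inside` every time a ray
  // cast from `p` crosses that edge.
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i];
    const b = polygon[j];
    const crosses =
      (a.lng > p.lng) !== (b.lng > p.lng) &&
      p.lat < ((b.lat - a.lat) * (p.lng - a.lng)) / (b.lng - a.lng) + a.lat;
    if (crosses) inside = !inside;
  }
  return inside;
}

// In a Jest spec, the corresponding test would assert known points, e.g.:
//   expect(pointInPolygon(center, hallBuildingOutline)).toBe(true);
const square: Point[] = [
  { lat: 0, lng: 0 },
  { lat: 0, lng: 2 },
  { lat: 2, lng: 2 },
  { lat: 2, lng: 0 },
];
console.log(pointInPolygon({ lat: 1, lng: 1 }, square)); // inside the square
console.log(pointInPolygon({ lat: 3, lng: 3 }, square)); // outside the square
```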
After the Jest unit tests complete in the build workflow, the report is saved as an artifact associated with that particular workflow run. Statement, branch, function, and line coverage are all displayed as percentages:
--------------------------------------|---------|----------|---------|---------|-----------------------------------------------
File | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
--------------------------------------|---------|----------|---------|---------|-----------------------------------------------
All files | 99.82 | 99.24 | 100 | 100 |
Snorting-Code | 0 | 0 | 0 | 0 |
declarations.d.ts | 0 | 0 | 0 | 0 |
expo-env.d.ts | 0 | 0 | 0 | 0 |
Snorting-Code/app | 100 | 100 | 100 | 100 |
CampusMapScreen.tsx | 100 | 100 | 100 | 100 |
IndoorMapScreen.tsx | 100 | 100 | 100 | 100 |
_layout.tsx | 100 | 100 | 100 | 100 |
index.tsx | 100 | 100 | 100 | 100 |
oauthredirect.tsx | 100 | 100 | 100 | 100 |
schedule.tsx | 100 | 100 | 100 | 100 |
Snorting-Code/components | 100 | 100 | 100 | 100 |
AccessibilityIcons.tsx | 100 | 100 | 100 | 100 |
AccessibleModeToggle.tsx | 100 | 100 | 100 | 100 |
BuildingInfoPopup.tsx | 100 | 100 | 100 | 100 |
CampusMap.tsx | 100 | 100 | 100 | 100 |
ColorAccessibilitySettingsModal.tsx | 100 | 100 | 100 | 100 |
DirectionStepsPanel.tsx | 100 | 100 | 100 | 100 |
IndoorPOIFilter.tsx | 100 | 100 | 100 | 100 |
IndoorPOIOverlay.tsx | 100 | 100 | 100 | 100 |
IndoorRouteOverlay.tsx | 100 | 100 | 100 | 100 |
IndoorSVGOverlay.tsx | 100 | 100 | 100 | 100 |
NavigationBar.tsx | 100 | 100 | 100 | 100 |
NextClassDirectionsPanel.tsx | 100 | 100 | 100 | 100 |
OutdoorPOIFilter.tsx | 100 | 100 | 100 | 100 |
POIListPanel.tsx | 100 | 100 | 100 | 100 |
POIRangeSelector.tsx | 100 | 100 | 100 | 100 |
ScheduleCalendar.tsx | 100 | 100 | 100 | 100 |
SheetContainer.tsx | 100 | 100 | 100 | 100 |
ShuttleBus.tsx | 100 | 100 | 100 | 100 |
ShuttleBusTracker.tsx | 100 | 100 | 100 | 100 |
ShuttleSchedulePanel.tsx | 100 | 100 | 100 | 100 |
ShuttleUnavailableBanner.tsx | 100 | 100 | 100 | 100 |
StrategyModeSelector.tsx | 100 | 100 | 100 | 100 |
ZoomableView.tsx | 100 | 100 | 100 | 100 |
Snorting-Code/constants | 100 | 100 | 100 | 100 |
buildings.ts | 100 | 100 | 100 | 100 |
campuses.ts | 100 | 100 | 100 | 100 |
indoorPOI.ts | 100 | 100 | 100 | 100 |
outdoorPOI.ts | 100 | 100 | 100 | 100 |
poiRange.ts | 100 | 100 | 100 | 100 |
semesterConfig.ts | 100 | 100 | 100 | 100 |
shuttle.ts | 100 | 100 | 100 | 100 |
strategies.ts | 100 | 100 | 100 | 100 |
theme.ts | 100 | 100 | 100 | 100 |
type.ts | 100 | 100 | 100 | 100 |
usabilityConfig.ts | 100 | 100 | 100 | 100 |
Snorting-Code/contexts | 100 | 100 | 100 | 100 |
ColorAccessibilityContext.tsx | 100 | 100 | 100 | 100 |
Snorting-Code/hooks | 100 | 100 | 100 | 100 |
useBottomInset.ts | 100 | 100 | 100 | 100 |
useFloorData.ts | 100 | 100 | 100 | 100 |
useLocationState.ts | 100 | 100 | 100 | 100 |
useNearbyPOIs.ts | 100 | 100 | 100 | 100 |
useSheetPanResponder.ts | 100 | 100 | 100 | 100 |
useShuttleAvailability.ts | 100 | 100 | 100 | 100 |
Snorting-Code/services | 100 | 98.8 | 100 | 100 |
GoogleAuthService.ts | 100 | 100 | 100 | 100 |
GoogleCalendarCacheStore.ts | 100 | 100 | 100 | 100 |
GoogleCalendarService.ts | 100 | 100 | 100 | 100 |
GoogleDirectionsService.ts | 100 | 94.82 | 100 | 100 | 48,274-286
GooglePlacesService.ts | 100 | 100 | 100 | 100 |
Routing.ts | 0 | 0 | 0 | 0 |
SelectedCalendarsStore.ts | 100 | 100 | 100 | 100 |
TokenStore.ts | 100 | 100 | 100 | 100 |
googleEnvUtils.ts | 100 | 100 | 100 | 100 |
Snorting-Code/styles | 100 | 100 | 100 | 100 |
AccessibilityIcons.styles.ts | 100 | 100 | 100 | 100 |
BuildingInfoPopup.styles.ts | 100 | 100 | 100 | 100 |
CampusMap.styles.ts | 100 | 100 | 100 | 100 |
CampusMapScreen.styles.ts | 100 | 100 | 100 | 100 |
DirectionStepsPanel.styles.ts | 100 | 100 | 100 | 100 |
IndoorMapScreen.styles.ts | 100 | 100 | 100 | 100 |
IndoorPOIFilter.styles.ts | 100 | 100 | 100 | 100 |
IndoorPOIOverlay.styles.ts | 100 | 100 | 100 | 100 |
IndoorRouteOverlay.styles.ts | 100 | 100 | 100 | 100 |
NavigationBar.styles.ts | 100 | 100 | 100 | 100 |
NextClassDirectionsPanel.styles.ts | 100 | 100 | 100 | 100 |
OutdoorPOIFilter.styles.ts | 100 | 100 | 100 | 100 |
POIListPanel.styles.ts | 100 | 100 | 100 | 100 |
POIRangeSelector.styles.ts | 100 | 100 | 100 | 100 |
ShuttleBar.styles.ts | 100 | 100 | 100 | 100 |
ShuttleBusTracker.styles.ts | 100 | 100 | 100 | 100 |
ShuttleSchedulePanel.styles.ts | 100 | 100 | 100 | 100 |
ShuttleUnvailableBanner.styles.ts | 100 | 100 | 100 | 100 |
Snorting-Code/utils | 99.21 | 97.2 | 100 | 100 |
IndoorMapComposite.ts | 100 | 100 | 100 | 100 |
buildingSearch.ts | 100 | 100 | 100 | 100 |
continueIndoors.ts | 100 | 100 | 100 | 100 |
destinationIndoorLeg.ts | 100 | 100 | 100 | 100 |
distance.ts | 100 | 100 | 100 | 100 |
findBuildingByCode.ts | 100 | 100 | 100 | 100 |
floorPlanSvgSources.ts | 100 | 100 | 100 | 100 |
indoorAccess.ts | 100 | 100 | 100 | 100 |
indoorBuildingPlan.ts | 100 | 100 | 100 | 100 |
indoorExit.ts | 100 | 100 | 100 | 100 |
indoorMapScreenHelpers.ts | 100 | 100 | 100 | 100 |
indoorNavigation.ts | 100 | 97.6 | 100 | 100 | 95-96,98
indoorPOI.ts | 100 | 100 | 100 | 100 |
indoorPathFinding.ts | 96.36 | 90.72 | 100 | 100 | ...66,179-180,189-190,253,258-266,273,300-301
indoorRoomSearch.ts | 100 | 100 | 100 | 100 |
mapAssets.ts | 100 | 100 | 100 | 100 |
parseCourseEvents.ts | 100 | 100 | 100 | 100 |
pointInPolygon.ts | 100 | 100 | 100 | 100 |
routeParams.ts | 100 | 100 | 100 | 100 |
routeTransition.ts | 100 | 100 | 100 | 100 |
shuttleAvailability.ts | 96 | 96.42 | 100 | 100 | 21
usabilityAnalytics.ts | 100 | 100 | 100 | 100 |
--------------------------------------|---------|----------|---------|---------|-----------------------------------------------
Test Suites: 72 passed, 72 total
Tests: 1491 passed, 1491 total
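Coverage floors like these can also be enforced mechanically so the workflow fails if coverage regresses. Below is a hedged sketch of how this could look in a `jest.config.ts`; the `jest-expo` preset and the exact threshold values are assumptions, not the project's actual configuration.

```typescript
// jest.config.ts -- illustrative sketch; the project's real config may differ.
const config = {
  preset: 'jest-expo', // assumption: standard preset for Expo projects
  collectCoverage: true,
  coverageReporters: ['text', 'lcov'], // 'lcov' output feeds Codecov and SonarQube
  coverageThreshold: {
    // Fail the workflow if coverage drops below these floors.
    global: { statements: 99, branches: 99, functions: 100, lines: 100 },
  },
};
export default config;
```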
In SonarQube, we can also see the main branch code coverage:

| User Story | Acceptance Test | Status |
|---|---|---|
| US-1.1 | AT-1.1 | Confirmed |
| US-1.2 | AT-1.2 | Confirmed |
| US-1.3 | AT-1.3 | Confirmed |
| US-1.4 | AT-1.4 | Confirmed |
| US-1.5 | AT-1.5 | Confirmed |
| User Story | Acceptance Test | Status |
|---|---|---|
| US-2.1 | AT-2.1 | Confirmed |
| US-2.2 | AT-2.2 | Confirmed |
| US-2.3 | AT-2.3 | Confirmed |
| US-2.5 | AT-2.5 | Confirmed |
| US-2.7 | AT-2.7 | Confirmed |
| User Story | Acceptance Test | Status |
|---|---|---|
| US-2.4 | AT-2.4 | Confirmed |
| US-2.6 | AT-2.6 | Confirmed |
| US-3.1 | AT-3.1 | Confirmed |
| US-3.2 | AT-3.2 | Confirmed |
| US-3.3 | AT-3.3 | Confirmed |
| US-3.4 | AT-3.4 | Confirmed |
| User Story | Acceptance Test | Status |
|---|---|---|
| US-4.1 | AT-4.1 | Confirmed |
| US-4.2 | AT-4.2 | Confirmed |
| US-4.3 | AT-4.3 | Confirmed |
| US-4.4 | AT-4.4 | Confirmed |
| US-4.6 | AT-4.6 | Confirmed |
| User Story | Acceptance Test | Status |
|---|---|---|
| US-4.5 | AT-4.5 | Confirmed |
| US-4.7 | AT-4.7 | Confirmed |
| US-4.8 | AT-4.8 | Confirmed |
| US-5.1 | AT-5.1 | Confirmed |
| US-5.2 | AT-5.2 | Confirmed |
ALL Acceptance Tests are CONFIRMED
System/end-to-end (E2E) testing is done via Maestro. The main E2E test for the previous sprint covered User Story 1.3, toggling between campuses.
Step 1: Open the app.
Step 2: Confirm the presence of "Select a campus" and select "SGW Campus"
Step 3: Confirm and select "Loyola"
Step 4: Confirm and select "SGW"
Step 5: Confirm that "Loyola" is still present
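The steps above translate to a Maestro flow roughly like the following. This is a sketch only: the `appId` and the exact visible texts are assumptions based on the steps listed, not the project's actual flow file.

```yaml
# campus-toggle.yaml -- illustrative Maestro flow for US-1.3 (assumed appId/texts)
appId: com.concordia.snortingcode
---
- launchApp
- assertVisible: "Select a campus"
- tapOn: "SGW Campus"
- assertVisible: "Loyola"
- tapOn: "Loyola"
- assertVisible: "SGW"
- tapOn: "SGW"
- assertVisible: "Loyola"
```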

E2E Test for User Story 1.5: Show additional information per building with pop-up

E2E Test for User Stories
2.1: Select start and destination buildings
2.2: Set user location as automatic starting point
2.3: Show outdoor directions on the map
2.4: Choose methods of transportation
- Select start and destination buildings in the navigation bar
- Select start and destination buildings from the building info popup

E2E Test for User Story 2.6: Time and location aware shuttle service view
- Shuttle schedule panel
- Shuttle service as transportation method between SGW and Loyola

E2E Test for User Story 2.7: Add building code labels to all campus buildings

E2E Test for User Stories:
3.1: Retrieve schedule from Google Calendar
3.2: Classroom location from schedule data
3.4: Support addition of multiple Google Calendars

E2E Test for User Story 3.3: Directions to next class based on current time

E2E Test for User Stories:
4.1: Show indoor maps for specific floor of a specific building
4.2: Show shortest indoor path to reach destination quickly

E2E Test for User Stories:
4.3: Accessible routes for user with disabilities
4.6: Navigation between rooms on different floors

E2E Test for User Story 4.4: Locate specific indoor rooms to find next class

E2E Test for User Story 4.5: Show indoor points of interest to find essential services

E2E Test for User Story 4.7: Navigate from indoor to outdoor directions between SGW and Loyola campuses

E2E Test for User Story 4.8: Navigate from indoor to outdoor directions between two buildings on the same campus

E2E Test for User Stories:
5.1: Show nearest outdoor points of interest based on range
5.2: Show directions for selected outdoor point of interest

E2E Test for extra accessibility feature: Color modes for color blind people

We have implemented a working campus navigation map experience and we now need real users to interact with it to validate usability, discover friction, and prioritize improvements. This section documents our investigation of usability testing methods and proposes a structured plan for running usability sessions and collecting measurable evidence.
This plan focuses on the features we have implemented so far, including campus selection, map interaction, building selection and highlighting, building information display, and outdoor route rendering.
We will evaluate usability for the current navigation map experience, including:
- Initial onboarding and first impression of the map screen
- Understanding and discoverability of the campus toggle (SGW vs Loyola)
- Map usability (panning, zooming, reading the map)
- Building interaction (selecting buildings via tap, seeing selection feedback)
- Viewing building details (pop up or building info panel behavior)
- Start and destination selection clarity
- Outdoor route generation and polyline visibility
- Responsiveness and perceived smoothness of interactions
- Error handling and recovery when a feature fails or is unclear
We will also test realistic usage conditions:
- Different phone sizes
- Android and iOS if available
- Indoor lighting and outdoor lighting conditions
The main user group for the app is Concordia University students navigating the campus, with the following characteristics:
- Gender: No bias. Navigational ability is not tied to gender.
- Age: 20 to 30 years old. This is the typical age of undergraduate university students, forming the large majority of on-campus users.
- Minimum Education Level: High school, consistent with the typical undergraduate population.
- Technical expertise: Comfortable with smartphone use in the context of a large city like Montreal.
- Campus familiarity: A mix of first-year students and non-first-year students.
The test group can be selected randomly from Concordia University students, hence the characteristics of the test group will match that of the main user group. This ensures that any issues found during usability testing reflect the issues of the main user base.
Before collecting data, we define what “good usability” looks like for this version of the app. These goals give us concrete targets for metrics, observations, and improvement decisions.
- Users can understand what the map screen is for within a few seconds.
- Users can find a building quickly with minimal confusion.
- Users can switch campuses without needing instructions.
- Users can set a start and destination without mixing them up.
- Users can generate and view an outdoor route clearly.
- Users feel the experience is smooth, intuitive, and consistent.
We will treat these as target thresholds to aim for. If we fail them, that indicates usability issues worth addressing.
- Time to complete key tasks is within reasonable limits:
  - Find a specific building: under 10 seconds
  - Toggle campus successfully: under 5 seconds
  - Generate a route between two buildings: under 30 seconds
- Task completion rate: at least 80 percent of participants complete each core task without help
- Error rate: fewer than 20 percent of participants make a major error per task
- Satisfaction: an average rating of at least 4 out of 5 on overall ease of use
- Standardized score: an average SUS score of at least 70 out of 100
To objectively evaluate usability, we will use the following quantitative metrics:
- Task success rate: the number of users who complete a task within a predetermined length of time.
- Time on task: the amount of time each user spends on a task.
- Touch count: the number of times a user interacts before completing the goal.
- Frustration index: a 1-to-5 scale rating of the experience of completing the task, with 1 being not frustrated and 5 being very frustrated.
These metrics will be recorded per task and analyzed across participants.
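To keep recordings consistent across observers, the per-task measurements can be captured in a single record shape. The sketch below is illustrative; the field names and sample data are our own, not from the codebase.

```typescript
// Illustrative record for one participant's attempt at one task
// (field names and sample values are assumptions, not project code).
interface TaskMeasurement {
  taskId: number;
  completed: boolean;             // finished within the predetermined time limit
  timeOnTaskSec: number;          // time on task
  touchCount: number;             // interactions before reaching the goal
  frustration: 1 | 2 | 3 | 4 | 5; // 1 = not frustrated, 5 = very frustrated
}

// Task success rate across participants for one task.
function taskSuccessRate(records: TaskMeasurement[]): number {
  if (records.length === 0) return 0;
  const done = records.filter((r) => r.completed).length;
  return done / records.length;
}

// Hypothetical sample: three participants attempting Task 2.
const task2: TaskMeasurement[] = [
  { taskId: 2, completed: true, timeOnTaskSec: 4, touchCount: 2, frustration: 1 },
  { taskId: 2, completed: true, timeOnTaskSec: 6, touchCount: 3, frustration: 2 },
  { taskId: 2, completed: false, timeOnTaskSec: 12, touchCount: 7, frustration: 4 },
];
console.log(taskSuccessRate(task2)); // 2 of 3 participants completed
```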
Smartlook (session replay and heatmaps):
- Pros:
  - Visualizes the "Frustration Index" by showing exactly where users tap
  - Provides heatmaps of building selections
- Cons:
  - Can impact app performance (FPS) on older devices
  - High data usage for mobile users

Firebase Analytics (event logging):
- Pros:
  - Lightweight
  - Great for quantitative data
  - Free tier is very generous
- Cons:
  - No visual context: you see that a user failed to generate a route, but not why

Screen recordings:
- Pros:
  - 100% accurate representation of the user experience
  - Captures verbal "think-aloud" feedback if audio is on
- Cons:
  - Requires manual review (very time-consuming)
  - Large file sizes; potential privacy concerns for users

Surveys (Google Forms):
- Pros:
  - Fast to set up
  - Automatically calculates averages for SUS and custom scores
  - Easy to share
- Cons:
  - Relies on user memory (subjective)
  - Users might skip open-ended questions if the survey feels too long
- For heatmaps on mobile applications, we will use Smartlook, which has a React Native SDK that is compatible with Expo.
- Screen recordings can also be used to track user interactions when using simulators.
- Firebase Analytics may be used for event-based logging of user actions such as campus toggling, building selection, and route generation.
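Event-based logging is easiest to analyze when all events funnel through one helper, so event names and parameters stay consistent. The sketch below builds the payload that would be forwarded to `analytics().logEvent` from `@react-native-firebase/analytics`; the event names and the session-ID field are our own assumptions, not the project's instrumentation.

```typescript
// Illustrative payload builder for usability events (names are assumptions).
type UsabilityEvent =
  | { name: 'campus_toggle'; campus: 'SGW' | 'Loyola' }
  | { name: 'building_select'; buildingCode: string }
  | { name: 'route_generate'; mode: string };

function buildUsabilityEvent(event: UsabilityEvent, sessionId: string) {
  const { name, ...params } = event;
  // The returned pair would be forwarded as analytics().logEvent(name, params),
  // with the session ID attached so events can be grouped per participant.
  return { name, params: { ...params, session_id: sessionId } };
}

const e = buildUsabilityEvent({ name: 'campus_toggle', campus: 'Loyola' }, 's-42');
// e.name is 'campus_toggle'; e.params carries campus and session_id
```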
We will use a mixed methods approach:
- Task-based usability testing with observation
- Survey-based feedback including:
  - Custom feature-specific survey questions
  - System Usability Scale (SUS) to produce a score out of 100
- Optional instrumentation with analytics tools to collect objective behavior metrics:
  - Firebase Analytics for event-based tracking
  - Smartlook for session replay and heatmaps
This plan supports both qualitative insights and quantitative evidence.
Instead of telling users to “try the app,” we will assign realistic tasks. This allows us to measure concrete outcomes like time, completion, errors, and confusion points.
Each participant session:
- 2 minutes introduction
- 10 to 15 minutes task completion
- 5 to 7 minutes survey completion
- Optional 2 to 3 minutes short interview questions
Total time per participant: 20 to 25 minutes.
Setup for the test:
- Set USABILITY_TESTING_ENABLED = true in /constants/usabilityConfig
- Optionally, to confirm that data is being recorded, run:

  Android:

      adb shell setprop debug.firebase.analytics.app com.concordia.snortingcode

  iOS:

      xcrun simctl spawn booted log stream \
        --predicate 'subsystem == "com.concordia.snortingcode"'

- Read through all of the testing elements
- Record the time the participant starts

All tasks should be tested in the following order.
Task 1: Find Current Location
- Timer starts when the CampusMapScreen loads
- Load the Home Screen
- Locate and click on the current location button
- Timer ends when the user clicks on the current location button
Task 2: Toggle campuses
- Timer starts when the CampusMapScreen loads
- Toggle to Loyola if in SGW, or vice versa
- Toggle back to the initial campus
- Timer ends when the user has toggled twice
Task 3: Find a specific building and show building details
- Timer starts when the CampusMapScreen loads
- View building markers (labels) and find H building
- Click on the building to see the building info
- Timer ends when the user taps the specific building
Task 4: Set the start and the destination point
- Timer starts when the building info pop-up is displayed
- Set the building as the start or the destination point for the navigation
- Timer ends when the user sets the building as the start or destination point
Task 5: Outdoor Navigation
- Timer starts when the building is set as the start or the destination point
- Set another building as a start or destination
- Click on the "Get Direction"
- Timer ends when the route is confirmed
Task 6: View direction steps panel
- Timer starts when the route is created
- Spot the steps panel and scroll through the steps
- Timer ends when the user closes the steps panel
Task 7: Shuttle schedule panel and shuttle as transportation
- Timer starts when the user clicks on the shuttle calendar button
- Spot the shuttle calendar button and click on it
- Discover the page: view bus stations and real-time bus pins on map
- Switch between campuses and switch from current to all times
- Timer ends when closing the panel
Task 8: My course schedule and next class direction
- Timer starts when you open the schedule page
- Go back to the home page
- Click on "My Schedule"
- Click on disconnect
- Follow the steps to connect your calendar with your Google email
- Toggle between all calendars
- Toggle between the different displays: "All", "Classes", "Event"
- Go back to the Home Screen and click on one of the campuses
- Spot the Next Class button and click on it
- Set your start point
- Click on "Get Direction"
- Timer ends when your route is displayed
Task 9: Get indoor directions between two rooms (different floor)
- Timer starts when the Indoor Map is loaded
- Close the steps panel
- Find the H Building and click on it
- Click on "Open Indoor Map"
- Set the starting point as H-867
- Set the destination point as H-920
- Timer ends when the route is displayed
Task 10: Explore indoor points of interest
- Timer starts when the Indoor Map loads
- Click on the bathroom and elevator point of interest
- Click on "Go" button
- Timer ends when the route is displayed
- Go back to the Map Screen
Task 11: Navigate with accessibility directions
- Timer starts when the user clicks on the Accessible button
- Set the start point to H-867
- Set the destination point to H-920
- Click on "Go" button
- Switch between floors to see the route
- Timer ends when the route is displayed
- Go back to the Map Screen
Task 12: Navigate indoors from SGW to Loyola (cross-campus)
- Timer starts when NavigationBar opens
- Set H-110 as the starting point
- Set CC-124 as the destination point
- Click on continue outdoor button
- Click to go indoor again
- Timer ends when the user is indoors in the other building
- Go back to the Map Screen
Task 13: Find nearby outdoor points of interest
- Timer starts when the user clicks on the POI button
- Spot the POI button
- Let the user choose the POI they want
- Change the range to what they want
- Click to close the POI panel
- Timer ends when the user closes the POI panel
Task 14: Get directions to an outdoor point of interest
- Timer starts when the user clicks on the POI button
- Spot the POI button
- Let the user choose the POI they want
- Change the range to what they want
- Let them click on one of the suggestions
- Click to get the directions
- Scroll through the steps
- Clear everything
- Timer ends when the route is displayed
Between participants
- Save the time they finish
- Let them fill out the survey: https://forms.gle/Znh65QoMkuBY63dz9
- Terminate the app completely and relaunch it for the next participant (reset the session_ID)
For each task we record:
- Task success rate
- Time on task
- Touch count
- Frustration index
- Observational notes
Scale:
1 Strongly disagree
2 Disagree
3 Neutral
4 Agree
5 Strongly agree
Questions:
- I think that I would like to use this app frequently.
- I thought the app was easy to use.
- I found the various functions in this app were well integrated.
- I thought there was too much inconsistency in this app.
- Overall, I am satisfied with the experience.
SUS scoring produces a final score out of 100.
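For reference, the standard SUS instrument has 10 alternating positively and negatively worded items, and its scoring arithmetic is fixed: odd items contribute (rating − 1), even items contribute (5 − rating), and the sum is multiplied by 2.5. The helper below is an illustrative sketch of that arithmetic, not project code.

```typescript
// Standard SUS scoring: 10 items rated 1-5, odd items positively worded,
// even items negatively worded. Each odd item contributes (r - 1), each even
// item contributes (5 - r); the sum times 2.5 yields a score out of 100.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error('SUS expects 10 responses');
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0,
  );
  return sum * 2.5;
}

// Best possible answers (5 on positive items, 1 on negative items) -> 100.
susScore([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]); // 100
// All-neutral answers (3 everywhere) -> 50.
susScore([3, 3, 3, 3, 3, 3, 3, 3, 3, 3]); // 50
```

Per-participant scores computed this way can then be averaged against the target threshold of 70 stated earlier.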
Scale:
1 Strongly disagree
2 Disagree
3 Neutral
4 Agree
5 Strongly agree
- How satisfied are you with the outdoor navigation?
- How satisfied are you with the display of shuttle information and use?
- How satisfied are you with the Google Calendar integration?
- How satisfied are you with the indoor navigation?
- How satisfied are you with the accessibility features (e.g., accessible routes)?
- How satisfied are you with the directions to the nearest points of interest (washrooms, water fountains, stairs, elevators)?
- How satisfied are you with the display of outdoor points of interest (restaurants, libraries, etc.)?
- How satisfied are you with the directions to the selected outdoor point of interest?
Open Ended Question:
- What was the most confusing part of the experience?
- What was the easiest or most intuitive part?
- Any additional comments or suggestions?
Here is the link to the survey: https://forms.gle/Znh65QoMkuBY63dz9
From the Google form survey:
Ease of Use: 88.9% of users gave the highest possible ratings for ease of use.
Complexity: 77.8% strongly disagreed that the app was unnecessarily complex.
Consistency: 100% of respondents felt the app was consistent (Score 1/5 on the "inconsistency" metric).
Confidence: 88.8% felt very confident using the app.
Learning Curve: 77.8% disagreed that they needed to learn many things before getting started.
Strengths
Campus Switching: 100% of users found the campus toggle easy to find; 77.8% rated the transition as "very smooth."
Building Interaction: 88.9% found it very clear when a building was selected and felt the visual highlighting made sense.
Performance: 100% of users felt the app responded quickly to their actions.
Intuition: Setting destinations and switching campuses were cited as the most "Google Maps-like" and intuitive features.
Areas for Improvement
Visual Clarity (Clutter): Several users mentioned that labels and icons become "clustered" or "overlapping" when zoomed out, making it hard to read building names.
Navigation UI: The "Set as Start" vs. "Set as Destination" buttons caused initial confusion for some.
The "Set as my location (demo)" button was flagged as unnecessary or confusing.
Route Aesthetics: Some users found the direction lines (walking vs. shuttle) visually confusing or too similar in color to the building highlights (red on red).
Calendar Integration: While the Google Calendar import was praised, one user expected a direct link to their student account/student center schedule.
Overall Reception: Strongly Positive. All participants rated the app 4 out of 5 or higher, with 64.3% giving a perfect score. This confirms that the application is not just functional but already delivers strong user value. At this stage, the focus shifts from validation to refinement of specific pain points.
Key Strengths Identified
- Directions to nearest POI: 92.9% rated this feature 5 out of 5, with the remainder rating it 4 out of 5. This indicates a clear and intuitive experience once users engage with it.
- Campus exploration and outdoor navigation: 78.6% of users rated both features 5 out of 5, with outdoor navigation achieving 100% satisfaction at 4 or above.
- User feedback highlights: Users consistently mentioned outdoor navigation, campus switching, and building highlighting as the easiest and most intuitive features.
- Calendar integration: 85.7% rated this feature 5 out of 5. Users specifically appreciated automatic schedule loading and next class routing, showing strong alignment with real user needs.
- Consistency: 92.9% rated inconsistency at 2 out of 5 or lower, indicating that the app feels stable and predictable.
Primary Area for Improvement: Indoor Navigation. Indoor navigation was the only feature with noticeably varied ratings. While 64.3% rated it 5 out of 5, others reported lower satisfaction, revealing usability gaps.
Common issues identified:
- Lack of room suggestions during search
- Strict input formatting with no guidance or error feedback
- Unclear confirmation step after selecting a room
- Visual disconnect between indoor and outdoor navigation
- Awkward transition between indoor and outdoor modes
These findings indicate that while the feature is functional, it requires UX improvements to match the clarity and intuitiveness of the outdoor experience.

- Tasks 1, 10, 11, and 13 show very low durations
  - These may not accurately reflect the real user experience
  - These tasks can be completed with a single tap or very quick interaction
  - Firebase may have recorded completion before participants fully engaged
  - The data for these tasks should be interpreted with caution
- Task 3 shows high duration with low variability
  - Most participants took a similarly long time
  - Participants were instructed to find a specific building (H)
  - This required scanning the entire campus map, especially for unfamiliar users
  - The difficulty comes from map discoverability, not the interaction itself

- Task 8 has the highest maximum duration
  - It is the most complex task
  - Requires navigating to the schedule screen
  - Requires connecting a Google account and selecting calendars
  - The wide range between min and max reflects different exploration behaviors: some participants completed it quickly, while others explored their calendars and the interface in depth
- Task 11 shows high variability
  - More thorough participants spent extra time evaluating accessibility
  - Some focused on usability for people with disabilities
  - Others completed only the minimum required interaction
- Task 7 duration reflects total time on the shuttle page, not just the action of opening it
  - Some participants opened and closed it quickly
  - Others explored all features and schedule details
  - This explains the wide variability

Two types of usability problems emerge:
- Average ≈ Median
  - Indicates a universal problem
  - All participants experienced similar difficulty
- Average >> Median
  - Indicates a discoverability issue
  - Most users succeeded, but a few struggled significantly

- Task 3
  - Average: 36 seconds; Median: 35 seconds
  - Almost identical values, indicating a universal issue
  - No participants found the building popup interaction intuitive
  - Highest priority fix in the app
- Task 7
  - Average: 19 seconds; Median: 14 seconds
  - Moderate gap, indicating consistent mild difficulty across users
  - Shuttle schedule button is slightly hard to locate
- Task 8
  - Average: 41 seconds; Median: 4 seconds
  - Large gap: most users found the feature instantly, but a few struggled heavily
- Task 11
  - Average: 24 seconds; Median: 8 seconds
  - Same pattern as Task 8: the feature is usable but not always discoverable

Interpretation:
- Task 3 and Task 7 require redesign; the problem affects all users
- Task 8 and Task 11 do not need redesign; they only need improved visibility
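The average-versus-median heuristic above can be made concrete in a few lines. In the sketch below, the 1.5× ratio cutoff and the sample durations are our own illustrative assumptions, not values taken from the study data.

```typescript
// Classify a task's duration distribution: when the average sits far above the
// median, a few participants struggled badly (discoverability issue); when the
// two are close, everyone had similar difficulty (universal issue).
// The 1.5x cutoff is an illustrative assumption, not a standard threshold.
function classifyDurations(durationsSec: number[]): 'universal' | 'discoverability' {
  const sorted = [...durationsSec].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  const median =
    sorted.length % 2 === 1 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
  const average = durationsSec.reduce((a, b) => a + b, 0) / durationsSec.length;
  return average > 1.5 * median ? 'discoverability' : 'universal';
}

// Task 8-like shape: most users near-instant, a few very slow.
classifyDurations([3, 4, 4, 5, 120, 110]); // 'discoverability'
// Task 3-like shape: everyone similarly slow.
classifyDurations([34, 35, 36, 37]); // 'universal'
```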

Three distinct groups emerge:
- Tasks 1, 2, 7, 11
  - Observations:
    - Tasks 1 and 2 were attempted by most participants and completed by all
    - These are the strongest interactions in the app
    - Tasks 7 and 11 had fewer attempts but still perfect completion
  - Interpretation:
    - These features work very well once discovered
    - Some features are more niche but still usable
- Tasks 3, 4, 5, 6, 8, 12, 13, 14
  - Observations:
    - Consistent drop-off across all tasks
    - Roughly 1 in 4 participants failed at each step
  - Interpretation:
    - No single broken feature
    - Indicates accumulated friction across the flow
    - Users gradually lose confidence or disengage
- Tasks 9 and 10
  - Completion rates: Task 9 ≈ 60%, Task 10 ≈ 57%
  - Observations:
    - The only tasks below all others in performance
    - Both relate to indoor navigation (same-floor navigation and accessible mode)
  - Interpretation:
    - Indoor navigation is the weakest part of the app
    - Issues are consistent across both standard and accessible features
    - This is the clearest redesign priority

The following heatmaps can be safely ignored, as they correspond to internal system components: _UICURSORACCESSORYVIEWCONTROLLER, MAINACTIVITY, and RNSSCREEN.
- Explanation:
  - _UICURSORACCESSORYVIEWCONTROLLER: triggered for all text input fields on iOS
  - MAINACTIVITY and RNSSCREEN: represent top-level containers for Android and iOS; they capture all interactions globally, providing little meaningful insight
- Additional irrelevant heatmaps: SWIFTUI.TABHOSTINGCONTROLLER, UIHOSTINGCONTROLLER<ERRORVIEW>, UIHOSTINGCONTROLLER<ROOTVIEW>, UITRACKINGELEMENTWINDOWCONTROLLER, UIVIEWCONTROLLER
  - These are automatically captured by Smartlook and do not represent actual user-facing components

This heatmap represents the campus map, the navigation bar, the outdoor POI popup, the shuttle information panel, and associated UI controls.
- Observations:
  - High concentration of clicks on navigation buttons, the back button, and route and interaction controls
  - Clusters near the center of the screen: likely from scrolling and building selection interactions
  - Clicks near the bottom: likely related to opening indoor maps
- Interpretation:
  - Interaction patterns align with expected user behavior
  - No anomalies detected

This heatmap represents the main entry screen (SGW, Loyola, My Schedule).
- Observations:
  - Expected clicks on the campus selection buttons and schedule access
  - Extra taps in the upper-left corner: caused by screen transitions, where tap events overlap when navigating between screens
  - Bottom-right interactions: correspond to color mode modal usage
- Interpretation:
  - Heatmap behavior is consistent with expected navigation flows
  - No usability concerns

This heatmap represents the indoor navigation interface.
- Observations:
  - Expected interaction clusters: the upper-left back button, the input fields for room selection, the “Go” button, and the floor switching controls
  - An additional, unexpected concentration of taps near the center of the screen
- Interpretation:
  - Most interactions are consistent with expected usage
  - The central cluster suggests possible confusion, mis-taps, or an unclear interaction affordance
- Action:
  - Investigate what UI element exists in this region
  - Determine if users are attempting an unsupported interaction

This heatmap represents the color mode selection modal.
- Observations:
  - Clicks concentrated on the color mode options and the bottom-right close button
  - Some users attempted dragging the modal down to dismiss it
- Interpretation:
  - The feature is used correctly
  - Minor mismatch in the expected interaction pattern (drag to close)
- Action:
  - Consider supporting swipe-to-dismiss for better UX consistency

This heatmap represents:
- Schedule screen accessed via “My Schedule”

Observations:

High concentration of clicks on:
- Disconnect button (used during testing flow)
- Calendar selection toggles
- Event/class filters

Interpretation:
- Interactions reflect expected task-based testing behavior
- No abnormal interaction patterns detected
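The tap clusters described in the heatmaps above can be derived by bucketing raw touch coordinates into a coarse grid. A minimal sketch of that aggregation; the grid size, event shape, and function name are illustrative assumptions rather than the actual analytics tooling:

```typescript
// Bucket raw tap coordinates into a coarse grid to surface tap clusters,
// similar in spirit to how heatmap tools aggregate touch events.
interface TapEvent {
  x: number; // horizontal screen coordinate in px
  y: number; // vertical screen coordinate in px
}

function buildHeatmapGrid(
  taps: TapEvent[],
  screenWidth: number,
  screenHeight: number,
  cells = 10 // grid resolution per axis (assumption)
): number[][] {
  const grid = Array.from({ length: cells }, () => new Array(cells).fill(0));
  for (const tap of taps) {
    const col = Math.min(cells - 1, Math.floor((tap.x / screenWidth) * cells));
    const row = Math.min(cells - 1, Math.floor((tap.y / screenHeight) * cells));
    grid[row][col] += 1;
  }
  return grid;
}
```

Cells with unusually high counts relative to their neighbors correspond to the "unexpected concentration of taps" flagged for investigation.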
Following our usability testing results, we implemented a series of targeted UI and functional refinements to bridge the gap between "working features" and an "intuitive experience." We primarily focused on the Indoor Navigation bottleneck by adding room autocomplete suggestions and relaxing rigid input requirements for classroom names, which directly addresses the high failure rates identified in our Firebase data. To improve the core map interaction, we corrected building label overlaps, added indoor zooming capabilities, and redesigned the navigation panel to ensure it no longer obstructs the active route. Additionally, we bolstered the app's inclusivity and clarity by introducing color-blind accessibility options, previewing all transportation modes simultaneously, and removing misleading "demo" buttons to provide a cleaner, production-ready interface.
Here is a list of all UI Improvement tasks.
Usability testing involves collecting behavioral data, survey responses, and potentially interaction logs. Even though this project is academic and limited in scope, we recognize that any form of user data collection requires responsible handling.
This section outlines how we ensure ethical testing practices, protect participant privacy, and minimize data risks.
Before participating in usability testing, all participants will:
- Be informed that they are testing a prototype campus navigation application.
- Be told what data will be collected, such as task completion times, survey responses, and optional analytics.
- Be informed that participation is voluntary.
- Be informed that they may withdraw at any time without explanation.
- Be informed that their responses will be anonymized.
If session replay or screen recording tools such as Smartlook are used, participants will be explicitly told that their interaction session may be recorded for usability analysis.
No participant will be recorded without clear notification and agreement.
We will only collect data that is directly relevant to usability evaluation.
Collected data may include:
- Task completion time
- Task success or failure
- Interaction events such as campus toggle or route generation
- Anonymous survey responses
- Observational notes taken by moderators
We will not collect:
- Full names
- Student IDs
- Email addresses
- GPS history outside the test scenario
- Background personal data
Analytics events will use anonymous identifiers rather than personal identifiers.
All survey results will be anonymous.
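The disallowed-fields policy above can also be enforced in code before any event leaves the device. A hedged sketch, where the field names and event shape are illustrative assumptions rather than the app's actual schema:

```typescript
// Strip disallowed fields from an analytics payload before logging, so only
// anonymized, usability-relevant data is recorded.
// Key names below are illustrative assumptions.
const DISALLOWED_KEYS = new Set(["name", "studentId", "email"]);

interface UsabilityEvent {
  participant: string; // generic label, e.g. "Participant 1"
  event: string; // e.g. "campus_toggle", "route_generated"
  params: Record<string, unknown>;
}

function sanitizeEvent(ev: UsabilityEvent): UsabilityEvent {
  const params: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(ev.params)) {
    if (!DISALLOWED_KEYS.has(key)) params[key] = value;
  }
  return { ...ev, params };
}
```

Routing every logged event through a single sanitizer like this makes the "minimal and relevant data collection" principle auditable in one place.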
If participants are referenced in documentation, they will be labeled generically, for example:
- Participant 1
- Participant 2
- User A
Any qualitative quotes included in the wiki will not contain identifying information.
Data will only be accessible to the project team and used strictly for academic usability evaluation purposes.
If Firebase Analytics is used:
- Only event-based behavioral data will be logged.
- No personal account system is implemented.
- No personal profile information will be stored.
If Smartlook or similar session replay tools are used:
- Participants will be informed in advance.
- Recordings will be used solely to understand UI confusion and interaction flow.
- Recordings will not be shared publicly.
- Recordings will not be used beyond the scope of usability evaluation.
We acknowledge that session replay tools can capture detailed interaction behavior. Therefore, their use will be limited to controlled usability sessions rather than open public deployment.
Since this application involves map features and potentially current building detection:
- We will not store raw GPS history.
- Location access will only be used during active testing.
- Location permissions will be explained clearly to participants.
- Location data will not be logged in analytics with precise coordinates tied to individuals.
The goal is to evaluate interaction design, not track real world movement patterns.
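One way to guarantee that precise coordinates are never tied to individuals is to coarsen positions before anything is logged. A minimal sketch, assuming a precision of two decimal places (roughly 1 km); both the precision choice and the function name are illustrative assumptions:

```typescript
// Round a coordinate to a fixed number of decimal places before logging,
// so analytics never carries precise positions tied to individuals.
// Two decimal places (~1 km) is an assumed precision, not a project decision.
function coarsenCoordinate(value: number, decimals = 2): number {
  const factor = 10 ** decimals;
  return Math.round(value * factor) / factor;
}
```

Applying this at the logging boundary means even a misconfigured analytics event cannot leak a participant's exact location.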
We recognize that moderated usability testing can introduce bias.
To reduce bias:
- Moderators will avoid leading participants.
- Participants will be encouraged to think aloud.
- No hints will be given unless a participant is completely stuck.
- Observations will be recorded objectively rather than interpreted emotionally.
We will also aim to include participants with different levels of familiarity with the Concordia campuses to reduce sample bias.
All reported metrics, scores, and observations will reflect actual collected data.
We will:
- Clearly separate measured results from proposed improvements.
- Avoid exaggerating usability performance.
- Document limitations such as small sample size or device constraints.
If usability issues are discovered, they will be documented transparently rather than hidden.
All collected usability data will:
- Be stored securely.
- Be retained only for the duration of the course project.
- Be deleted after project completion unless required for grading review.
No usability data will be reused for unrelated purposes.
Our usability testing approach follows these principles:
- Transparency with participants
- Minimal and relevant data collection
- Anonymization of responses
- Respect for user privacy
- Responsible use of analytics tools
- Honest and accurate reporting of findings