MVP Milestone Review Report - bounswe/bounswe2026group11 GitHub Wiki

D6. Milestone Review

D6.1 Project Status

Summary and milestone alignment

The milestone under review is the MVP target delivery dated April 7, 2026. As defined in the MVP Milestone page, the goal was to deliver a usable first version of Social Event Mapper centered on registration, event creation, discovery, participation, and host-side management. Based on the implemented backend, web, and mobile flows, together with the MVP Demo Plan, we consider the MVP milestone goals to have been achieved within the finalized MVP scope.

The delivered MVP provides a coherent end-to-end journey across all three layers:

  • authentication, session handling, and profile setup
  • profile management and participation history
  • event creation, host management, and event cancellation
  • list-based event discovery with search and filtering
  • public join flow and protected join-request approval flow
  • favorites, favorite locations, and event metrics
  • deployment support, CI/CD workflows, release packaging, and OpenAPI documentation

The alignment between the milestone goals and the implemented MVP can be summarized as follows:

| MVP goal defined in the milestone | Status | How it was achieved |
| --- | --- | --- |
| Deliver a usable first version of the platform | Completed | A working full-stack baseline was produced across backend, web, and mobile, and organized into a demoable story |
| Support user access and account setup | Completed | Registration/login, session handling, forgot-password flow, and profile management are implemented |
| Support host-side event creation and management | Completed | Hosts can create events, monitor their own events, review requests, and cancel events |
| Support event discovery and filtering | Completed | Users can browse events in list view, search by keyword, and filter by category, location, radius, and time |
| Support participation flows based on privacy rules | Completed | Public events allow direct joining, while protected events require a host-approved join request |
| Support engagement and continuity features | Completed | Favorites, favorite locations, profile/history pages, and event metrics are available in the MVP |
| Keep the product reviewable and deployable | Completed | The project includes release artifacts, CI workflows, deployment automation, and a dedicated MVP demo plan |

Development decisions and scope changes

While the milestone goals were achieved, several important development decisions were made to keep the MVP focused and deliverable:

  • We prioritized a coherent, testable end-to-end core flow over breadth. Instead of trying to implement every long-term feature, the MVP concentrated on the shortest path from account access to discovery, participation, and host management.
  • We intentionally split platform responsibilities: web was used mainly for host-side tasks such as event creation and request management, while mobile focused on discovery, joining, favorites, and profile/history flows. This decision also matches the MVP Demo Plan.
  • Although early customer feedback emphasized both list view and map view, the team adopted a list-first discovery strategy for the MVP. This preserved the core discovery value while reducing implementation and demo risk.
  • The MVP implemented public and protected participation flows as the primary privacy model for this milestone, while invitation-heavy private-event flows were intentionally left for later work.
  • Request tracking was kept page-based in host and participant views; building a full real-time notification center was deferred beyond the MVP.
  • The team also refined some technical details during development, such as using an email OTP flow for MVP registration and recovery, and keeping the create-event flow centered on reliable point-based location input even though route-based events remain part of the broader product direction.
  • Advanced features such as discussion/comments, moderation/reporting, rich notifications, full event version history, and reconfirmation logic were treated as post-MVP extensions rather than blockers for the first usable release.

These decisions did not reduce the milestone's main value; rather, they made it possible to satisfy the actual MVP objective stated in the milestone page: delivering a usable and reviewable first version of the product.

Customer feedback, elicitation impact, and reflections

Customer and stakeholder conversations had a direct effect on both the MVP scope and the wording of the requirements. The most helpful elicitation questions were the ones that clarified behavior and priority, not just desirable features:

  • Question 12 ("How should users discover events?") was especially important because it clarified that discovery should ultimately support both list and map views. This directly shaped the product direction and led to the MVP decision to start with a stable list/search/filter flow first.
  • Question 14 ("How should the system handle event updates or cancellations?") was highly influential because it turned event lifecycle into a concrete requirement set. The answers led to explicit status handling, visibility of canceled events, and later lifecycle-oriented requirements such as versioning and reconfirmation.
  • Question 15 ("What should be the minimum features of the first version?") was the key scope-setting question. It helped separate must-have MVP flows from valuable but non-essential ideas.
  • Questions 7, 9, and 10 were also important because they transformed event pages from simple listings into richer domain objects with visible metrics, optional constraints, and clearer participation expectations.

Several stakeholder responses directly transformed the requirements and our implementation priorities:

  • The customer's request for three privacy tiers and a pending-request approval mechanism shaped the privacy and participation model. For the MVP, this became a concrete focus on public and protected flows.
  • The request that canceled events should remain visible with a clear status transformed the lifecycle model and reinforced the use of Active, Completed, and Canceled states.
  • Feedback that participant counts and save counts should be visible made event metrics, favorites, and history-based trust signals part of the MVP story rather than optional polish.
  • Stakeholder feedback on Google Maps navigation, discussion/comments, custom tags, category suggestions, event reporting, and moderation expanded and clarified the requirements set, even where some of these items were consciously deferred beyond the MVP.
  • Feedback about requirement wording and consistency also helped improve the quality of the requirements document itself, not only the feature list.

In addition to the earlier elicitation meetings, the MVP demo also produced concrete feedback for the next milestone:

  • The customer requested file attachment support for join requests, especially image or PDF upload, so that users can provide evidence for event-specific constraints.
  • The customer requested that discovery results respect age and gender restrictions more strictly: users should only be shown events they are eligible for, and ineligible events should be filtered out of the results.
  • The customer requested a short privacy-level explanation on the create-event screen so that hosts can better understand the difference between public, protected, and private visibility/participation models before publishing an event.

These items do not block the MVP story we demonstrated, but they are valuable usability and domain-fit improvements. They were recorded as actionable feedback and will be addressed as part of the Final Milestone scope.

Our main reflection is that the most valuable feedback was that which reduced ambiguity and revealed where the current MVP could better match real usage. It helped the team convert a broad product idea into a smaller but complete first release, while also identifying the highest-value refinements for the next iteration. In hindsight, the team could have documented some scope reductions earlier and linked each major feedback item more explicitly to the corresponding requirement update. Even so, the milestone as a whole shows that the team was able to absorb feedback, make pragmatic scope decisions, and still deliver all core MVP goals promised in the MVP Milestone page.

D6.2 Deliverables

Deliverable Status

Completed means the feature is implemented, tested, documented, and deployed. In Progress means partially implemented or not yet consistent across all surfaces. Not Started means planned for the Final Milestone with no implementation begun.

| Feature Area | Status |
| --- | --- |
| User registration and authentication (email OTP, login, forgot-password, token rotation) | Completed |
| Profile management (view, edit, avatar) | Completed |
| Participation history (hosted, upcoming, completed, canceled events) | Completed |
| Event creation with image upload, tags, and participation constraints | Completed |
| Event cancellation and lifecycle status tracking (Active, Completed, Canceled) | Completed |
| Proximity-based event discovery with keyword search and filtering | Completed |
| Public event direct join and protected event join-request flow | Completed |
| Host join-request management (view, approve, reject) | Completed |
| Post-event ratings (participants rate events; hosts rate participants; host score visible) | Completed |
| Event favorites and favorite locations | Completed |
| CI/CD workflows, Docker deployment, release packaging, and OpenAPI documentation | Completed |
| Event editing, versioning, reconfirmation flows, and meeting point support | In Progress |
| Private events and invitation-based participation | Not Started |
| Map-based discovery, route-based events, and mini maps in event detail | Not Started |
| Discovery filtering by age and gender eligibility | Not Started |
| In-app, email, and push notifications | Not Started |
| Digital ticketing with QR-based host validation | Not Started |
| Event reporting, moderation review, and admin panel | Not Started |
| Discussion section, post-event comments, and join-request file attachments | Not Started |
| Dark mode, bilingual Turkish/English support, and Sentry error monitoring | Not Started |

Evaluation and Reflection

All MVP feature deliverables are completed and the team enters the Final Milestone with a working, deployed baseline. Every Final Milestone item is a net-new extension, not a catch-up. The hardest cross-layer work — auth plumbing, CI, deployment, location-based search — is already in place.

Two MVP decisions create forward-looking obligations. The list-first discovery approach means the Final Milestone must deliver a credible map view; the backend proximity queries are already compatible with a map client, so the effort is concentrated on the frontend and mobile layers. The page-based request tracking means notifications must be introduced in the Final Milestone to avoid users having to poll for status changes — this should be planned early given its cross-cutting scope.

Additionally, three customer requests from the MVP demo (file attachments on join requests, age/gender eligibility filtering in discovery, privacy-level helper text on the create-event screen) were not in the original Final Milestone plan and need to be incorporated.


UX Design

Key domain-driven design decisions made during the MVP:

Privacy-aware join interaction. The join call-to-action changes based on event type: "Join" for public events, "Request to Join" for protected events. Pending, approved, and rejected request states are visible to the participant, and hosts see a dedicated approval queue. This makes the access control model legible without documentation.

Location as a first-class primitive. Discovery is proximity-first — every query is anchored to a coordinate and radius. Favorite locations (up to 3 per user) let users set named anchors such as "Home" or "Work" as their search center, reducing repetitive input. This reflects that social events are inherently local.

Native date and time pickers on event creation. The create-event form uses the platform's native date and time pickers rather than custom components. This keeps the interaction familiar and avoids reinventing time input in a domain where precise scheduling matters.

No QR scanning on web. QR-based ticket validation is planned for the Final Milestone on mobile only. Implementing a camera-based QR scanner in a web browser is poor UX and unnecessary when mobile devices are the natural tool for in-person validation at events.

Event lifecycle visibility. Completed and canceled events remain visible with a clear status indicator rather than disappearing. This supports participation history and was directly requested by stakeholders so participants can understand the full arc of events they were involved in.
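The lifecycle behavior described above can be sketched as a small state model. The status names below match the ones listed in this report, but the transition rules are an assumption (Completed and Canceled treated as terminal), not the project's actual implementation:

```go
package main

import "fmt"

// EventStatus models the MVP lifecycle states named in this report.
type EventStatus string

const (
	StatusActive    EventStatus = "ACTIVE"
	StatusCompleted EventStatus = "COMPLETED"
	StatusCanceled  EventStatus = "CANCELED"
)

// canTransition treats Completed and Canceled as terminal: a canceled
// event stays visible with its status but never becomes active again.
func canTransition(from, to EventStatus) bool {
	if from != StatusActive {
		return false // terminal states never change
	}
	return to == StatusCompleted || to == StatusCanceled
}

func main() {
	fmt.Println(canTransition(StatusActive, StatusCanceled)) // true
	fmt.Println(canTransition(StatusCanceled, StatusActive)) // false
}
```

Keeping terminal states visible rather than deleting them is what makes the participation-history pages possible.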

Trust signals surfaced at discovery depth. Host score, participant count, and favorite count are returned with discovery results — not only on the event detail page. Users can evaluate event quality at the moment of browsing without navigating further.


API Documentation

The API is documented using OpenAPI 3.1 specifications organized by domain area.

OpenAPI specification files:

| Domain | File |
| --- | --- |
| Authentication | docs/openapi/auth.yaml |
| Events | docs/openapi/event.yaml |
| Profile | docs/openapi/profile.yaml |
| Favorite Locations | docs/openapi/favorite_location.yaml |
| Categories | docs/openapi/category.yaml |

Example 1 — Registration flow

Step 1: Request an email OTP.

```http
POST /api/auth/register/email/request-otp
Content-Type: application/json

{ "email": "trailrunner42@example.com" }
```

Response 202 Accepted:

```json
{
  "status": "accepted",
  "message": "If the email can be registered, an OTP has been sent."
}
```

Step 2: Verify the OTP and create the account.

```http
POST /api/auth/register/email/verify
Content-Type: application/json

{
  "email": "trailrunner42@example.com",
  "otp": "482951",
  "username": "trailrunner42",
  "password": "StrongPassword123"
}
```

Response 200 OK:

```json
{
  "access_token": "<jwt>",
  "refresh_token": "<opaque>",
  "user": {
    "id": "2df86e13-2d2b-4dca-8d60-f6d9f8d6bb1d",
    "username": "trailrunner42"
  }
}
```
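As a rough illustration of the OTP step in this flow, a backend might generate the 6-digit code along these lines. `generateOTP` is a hypothetical helper, and the real backend's code format, storage, and expiry policy are not specified in this document:

```go
package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

// generateOTP returns a zero-padded 6-digit code drawn from
// crypto/rand, so codes like "042951" are possible. This is only a
// sketch of the email OTP step, not the project's actual code.
func generateOTP() (string, error) {
	n, err := rand.Int(rand.Reader, big.NewInt(1_000_000))
	if err != nil {
		return "", err
	}
	return fmt.Sprintf("%06d", n.Int64()), nil
}

func main() {
	otp, err := generateOTP()
	if err != nil {
		panic(err)
	}
	fmt.Println(len(otp)) // prints 6
}
```

Using `crypto/rand` rather than `math/rand` matters here because the OTP is a security credential, not just a random label.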

Example 2 — Event discovery (proximity search)

```http
GET /api/events?lat=41.0082&lon=28.9784&radius_km=10&category_id=7&sort_by=DISTANCE&limit=20
Authorization: Bearer <access_token>
```

Response 200 OK (excerpt):

```json
{
  "items": [
    {
      "id": "5f81d6de-8cb4-4639-8f60-7c9f55456121",
      "title": "Istanbul Trail Run",
      "privacy_level": "PROTECTED",
      "status": "ACTIVE",
      "start_time": "2026-05-01T08:00:00+03:00",
      "approved_participant_count": 12,
      "favorite_count": 8,
      "distance_km": 4.2,
      "is_favorited": false,
      "category": { "id": 7, "name": "Outdoors" },
      "host": {
        "username": "host_user",
        "display_name": "Host User",
        "host_score": { "final_score": 4.18, "hosted_event_rating_count": 9 }
      },
      "location": {
        "type": "POINT",
        "address": "Belgrad Forest, Istanbul",
        "point": { "lat": 41.1722, "lon": 28.9744 }
      }
    }
  ],
  "page_info": {
    "next_cursor": "eyJpZCI6IjVmODFkNmRlIn0",
    "has_next_page": true
  }
}
```
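The `distance_km` field implies a great-circle distance between the query anchor and the event's point. A minimal sketch of that computation follows; `haversineKm` is an assumed helper name, and in practice the backend may push the radius filter into PostgreSQL rather than computing it in application code:

```go
package main

import (
	"fmt"
	"math"
)

// haversineKm computes the great-circle distance in kilometres
// between two latitude/longitude pairs (degrees), using a mean
// Earth radius of 6371 km.
func haversineKm(lat1, lon1, lat2, lon2 float64) float64 {
	const earthRadiusKm = 6371.0
	toRad := func(deg float64) float64 { return deg * math.Pi / 180 }
	dLat := toRad(lat2 - lat1)
	dLon := toRad(lon2 - lon1)
	a := math.Sin(dLat/2)*math.Sin(dLat/2) +
		math.Cos(toRad(lat1))*math.Cos(toRad(lat2))*
			math.Sin(dLon/2)*math.Sin(dLon/2)
	return 2 * earthRadiusKm * math.Asin(math.Sqrt(a))
}

func main() {
	// Straight-line distance from the example query anchor to the
	// example event's point (illustrative coordinates only).
	fmt.Printf("%.1f km\n", haversineKm(41.0082, 28.9784, 41.1722, 28.9744))
}
```

Because the same proximity math backs both the list view and a future map view, the Final Milestone map work is largely a client-side concern, as noted in the evaluation above.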

Example 3 — Submit a join request (protected event)

```http
POST /api/events/5f81d6de-8cb4-4639-8f60-7c9f55456121/join-request
Authorization: Bearer <access_token>
Content-Type: application/json

{ "message": "I have trail shoes and run weekly." }
```

Response 201 Created:

```json
{
  "id": "a1b2c3d4-...",
  "status": "PENDING",
  "created_at": "2026-04-11T10:30:00+03:00"
}
```

Host approves the request:

```http
POST /api/events/5f81d6de-8cb4-4639-8f60-7c9f55456121/join-requests/a1b2c3d4-.../approve
Authorization: Bearer <host_access_token>
```

Response 200 OK:

```json
{ "status": "APPROVED" }
```
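The public/protected distinction driving this flow can be sketched as follows. `joinOutcome` and the constant names are illustrative, not the project's actual API; the rule itself (direct join for public events, host-approved request for protected ones) is the one stated in this report:

```go
package main

import (
	"errors"
	"fmt"
)

// PrivacyLevel mirrors the privacy tiers named in this report.
type PrivacyLevel string

const (
	Public    PrivacyLevel = "PUBLIC"
	Protected PrivacyLevel = "PROTECTED"
	Private   PrivacyLevel = "PRIVATE" // invitation flows deferred beyond the MVP
)

// joinOutcome sketches the MVP participation rule: public events
// admit directly, protected events open a PENDING request.
func joinOutcome(level PrivacyLevel) (string, error) {
	switch level {
	case Public:
		return "JOINED", nil // direct join, no approval step
	case Protected:
		return "PENDING", nil // host must approve before participation
	default:
		return "", errors.New("invitation-based joining is out of MVP scope")
	}
}

func main() {
	out, _ := joinOutcome(Protected)
	fmt.Println(out) // prints PENDING
}
```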

Example 4 — Save a favorite location

```http
POST /api/me/favorite-locations
Authorization: Bearer <access_token>
Content-Type: application/json

{
  "name": "Home",
  "address": "Kadıköy, Istanbul",
  "lat": 40.9917,
  "lon": 29.0277
}
```

Response 201 Created:

```json
{
  "id": "550e8400-e29b-41d4-a716-446655440000",
  "name": "Home",
  "address": "Kadıköy, Istanbul",
  "lat": 40.9917,
  "lon": 29.0277
}
```
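The UX section notes that users may save up to three favorite locations. A minimal sketch of that server-side rule follows; `maxFavoriteLocations`, `addFavorite`, and the error text are illustrative names, not the project's actual code:

```go
package main

import (
	"errors"
	"fmt"
)

// maxFavoriteLocations encodes the "up to 3 per user" limit stated
// in the UX Design section of this report.
const maxFavoriteLocations = 3

type FavoriteLocation struct {
	Name     string
	Lat, Lon float64
}

// addFavorite rejects a fourth location rather than silently
// dropping or replacing an existing one.
func addFavorite(existing []FavoriteLocation, loc FavoriteLocation) ([]FavoriteLocation, error) {
	if len(existing) >= maxFavoriteLocations {
		return existing, errors.New("favorite location limit reached")
	}
	return append(existing, loc), nil
}

func main() {
	locs := []FavoriteLocation{{Name: "Home", Lat: 40.9917, Lon: 29.0277}}
	locs, err := addFavorite(locs, FavoriteLocation{Name: "Work", Lat: 41.0851, Lon: 29.0440})
	fmt.Println(len(locs), err) // prints 2 <nil>
}
```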

D6.3 Requirements

The following requirements are addressed in the MVP:

  • 1.1.3, 1.1.4, 1.2.1-1.2.3, 1.3.1
  • 2.1.1-2.1.5, 2.2.2, 2.3.1-2.3.3, 2.4.1
  • Portions of 3.1.1-3.1.3
  • 3.2.1-3.2.3, 3.3.1, 3.3.2
  • 4.1.1-4.1.4
  • 4.4.1-4.4.3
  • Portions of 4.5.1-4.5.6
  • 5.1.1
  • 5.2.1, 5.3.1-5.3.2, 5.4.1-5.4.2, 5.5.1-5.5.2, 5.6.1-5.6.2, 5.7.1-5.7.2
  • 6.1.1-6.1.2, 6.2.1-6.2.2
  • 6.3.1-6.3.3
  • 6.4.2
  • 8.1.1, 8.3.1
  • 11.1.1-11.1.2, 11.2.1
  • 13.1.1-13.1.2

D6.4 Testing

D6.4.1 Test plan & strategy

The detailed testing strategy, coverage matrix, user acceptance criteria, and acceptance threshold are documented on the project's Test Plan & Coverage page.

The project uses:

  • backend unit tests
  • backend integration tests with PostgreSQL and testcontainers
  • frontend unit/UI tests with Vitest
  • mobile tests with Jest
  • manual API testing through Swagger UI
  • manual UI testing for end-to-end demo flows

D6.4.2 Test Execution Reports

Backend API: Backend Coverage HTML Report

Web App: Web App Frontend Coverage HTML Report

Mobile App: Mobile Coverage HTML Report | XML

D6.4.3 Impact Analysis

The impact analysis is based on the project’s Test Plan & Coverage page, the configured CI workflows, and the generated backend, frontend, and mobile test reports.

Coverage:

  • Automated tests exist for backend, frontend, and mobile, and core MVP flows are covered by a combination of automated and manual testing.
  • The backend report uses Go’s built-in statement coverage tooling and shows 41.3% total statement coverage. It provides detailed file-level and line-highlighted visibility into covered and uncovered code paths.
  • Backend coverage is strongest in several core business and handler areas, including event validation, join request handling, participation logic, rating helpers, authentication validation, event handlers, image upload handlers, middleware, and favorite location handling.
  • The backend report also reveals lower or missing automated coverage in some infrastructure and bootstrap-oriented files, such as server startup, database initialization, JWT adapters, PostgreSQL repository adapters, and some wiring/configuration paths. These areas are partly exercised through integration tests and deployment/manual verification, but they remain candidates for stronger automated coverage.
  • The frontend report covers 44 files and shows 21.1% statement coverage, 52.7% branch coverage, 32.2% function coverage, and 21.1% line coverage.
  • Frontend coverage is concentrated in important MVP-facing areas such as app routing, authenticated API client behavior and token refresh handling, validation utilities, event status helpers, image resize logic, event detail view-model behavior, favorites state management, my-events state handling, and key event page rendering behavior.
  • The frontend report also shows comparatively lower coverage in some shared shell components, route guards, logo/branding components, and other UI structure files that are less directly exercised by the current automated suite.
  • The mobile test suite covers utility functions, API/service logic, session management, authentication flows, event creation and detail flows, favorite events/locations, profile flows, and selected UI components.
  • The generated mobile coverage report shows 75.55% statement coverage, 59.92% branch coverage, 66.50% function coverage, and 77.12% line coverage.

Bug Detection:

  1. Backend CI includes static checks, linting, unit tests, integration tests, and Docker image verification.
  2. Backend tests help detect regressions in authentication, event discovery and validation, join request handling, participation, ratings, image upload, favorite locations, and HTTP handler behavior.
  3. Frontend tests help detect regressions in routing, token refresh behavior, validators, event detail interactions, favorites state, my-events filtering, and selected page-level rendering behavior.
  4. Mobile tests help detect regressions in auth, discovery, event detail, favorites, profile, validation, and participation flows.
  5. The mobile test execution report confirms that all mobile tests passed successfully: 29 test suites and 352 tests passed.
  6. The mobile XML report provides a machine-readable execution summary with 352 tests, 0 failures, and 0 errors.

Readiness:

  • The system is ready for MVP-level review and demo of the core user journeys.
  • The backend, frontend, and mobile reports provide concrete evidence that important business flows are covered by automated tests.
  • The passing automated suites provide confidence that the main implemented flows behave as expected across the three surfaces.
  • Remaining risks are mostly related to limited automated coverage for some UI edge cases, platform-specific mobile behavior, infrastructure/bootstrap code, and flows that still require manual end-to-end validation.

D6.5 Planning and Team Process

Description

The team used a structured process centered around:

  • GitHub issues and pull requests as the main technical coordination mechanism
  • WhatsApp for quick communication and urgent coordination
  • Google Meet for weekly team meetings
  • subgroup ownership for backend, web frontend, mobile, and DevOps responsibilities
  • wiki-based documentation for planning, meeting notes, and milestone artifacts
  • CI/CD workflows for continuous validation and deployment support

Evaluation

The team process appears disciplined and well documented. Ownership was visible, planning artifacts were produced on time, and the demo itself was prepared with role assignments and fallback planning.

The clearest strengths were:

  • documented communication and planning
  • clear sub-team ownership
  • regular review through PRs and CI
  • a demo plan tied directly to implemented MVP flows

The main improvement areas are:

  • recording milestone feedback more explicitly after reviews
  • keeping requirements synchronized with actual implementation choices
  • attaching formal test reports instead of only describing the strategy

Links to screenshots, documents

D7. Individual Contributions

| Team Member | Subgroup | Individual Contribution Page |
| --- | --- | --- |
| Buğra Keser | Mobile | MVP Milestone Individual Contribution: Buğra Keser |
| Sevde Pekköse | Mobile | MVP Milestone Individual Contribution: Sevde Pekköse |
| Oğuz Özer | Frontend | MVP Milestone Individual Contribution: Oğuz Özer |
| Mehmet Akif Yıldırım | Backend | MVP Milestone Individual Contribution: Mehmet Akif Yıldırım |
| Cansu Er | Mobile | MVP Milestone Individual Contribution: Cansu Er |
| Emine Türk | Frontend | MVP Milestone Individual Contribution: Emine Türk |
| Utku Yiğit Demir | Backend | MVP Milestone Individual Contribution: Utku Yiğit Demir |
| Mehmet Kaan Ünsel | Backend & DevOps | MVP Milestone Individual Contribution: Mehmet Kaan Ünsel |