
Software Requirements Specification

1. Introduction

1.1 Purpose

The purpose of this document is to specify the requirements for developing a web application where users label music files based on their emotional impact. The project uses crowdsourcing to gather a diverse range of interpretations of the emotional content of music, ultimately creating a comprehensive dataset that reflects how different listeners respond emotionally to the same music. Through this platform, we aim to compile a dataset of 10,000 labeled music files, representing a wide spectrum of emotional reactions from users.

To maintain high levels of user engagement, the application will incorporate gamification elements, including points, badges, and levels. These features make the labeling process more enjoyable and rewarding, encouraging users to participate actively. Users will be able to track their progress, join challenges, and view their ranking on leaderboards, introducing a competitive and fun element to the experience. This approach aims to foster sustained engagement, contributing to the project's goal of gathering an extensive and accurate set of emotion-based music labels.


1.2 Scope of the Project

The core functionality of this application is to provide a platform where users can listen to curated music tracks and express their emotional responses by selecting from a range of predefined emotion categories. This process aims to capture diverse emotional feedback that will help create a comprehensive dataset, useful for understanding how different types of music affect listeners emotionally. The platform appeals to a wide range of users, from casual listeners seeking self-reflection to music enthusiasts, researchers, and those interested in exploring how music influences emotional well-being.

By allowing users to select from a set of emotion tags such as "Happy," "Sad," "Energetic," or "Relaxed," the application provides a structured, yet user-centered way of recording emotional data. This dataset serves as the foundation for future analysis and research across fields like music therapy, psychology, and data science, offering insights into patterns like common emotional responses to certain genres, tempos, or lyrical themes. It is designed to allow for in-depth exploration of how demographics and personal background may impact emotional responses to various music elements.[4]

Key Features of the App

Emotion Labeling and Survey Feedback Collection

Users listen to each music file and choose an emotion that aligns with their experience. Additionally, they have the option to provide more detailed feedback, describing how the song impacted them personally. This data captures not only immediate emotional reactions but also in-depth, nuanced responses that contribute to a richer understanding of music’s emotional impact.
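
A minimal sketch of what a single annotation record could look like, assuming a Python backend; the class and field names (EmotionAnnotation, track_id, free_text_feedback) are illustrative and not part of this specification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Illustrative tag set; the actual predefined emotion categories are defined by the project.
PREDEFINED_EMOTIONS = {"Happy", "Sad", "Energetic", "Relaxed", "Calm", "Reflective"}

@dataclass
class EmotionAnnotation:
    """One user's emotional response to one track, plus optional free-text feedback."""
    user_id: str
    track_id: str
    emotion: str
    free_text_feedback: Optional[str] = None
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self) -> None:
        # Reject tags outside the predefined emotion set so the dataset stays consistent.
        if self.emotion not in PREDEFINED_EMOTIONS:
            raise ValueError(f"Unknown emotion tag: {self.emotion}")

# A listener tags a track as "Happy" and adds an optional comment.
annotation = EmotionAnnotation(
    user_id="user-42",
    track_id="track-1001",
    emotion="Happy",
    free_text_feedback="The chorus reminded me of summer road trips.",
)
```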

Gamification Elements

By integrating points, badges, levels, and leaderboards, the app encourages ongoing participation, making the labeling process more engaging and rewarding. These elements help foster a competitive and fun environment, where users are motivated to track their progress, participate in challenges, and view their standings on leaderboards. Gamification not only boosts engagement but also aids in collecting a larger, more accurate dataset.

Interactive User Profiles and Emotion Tagging

Each user’s emotional responses and survey results are saved in a personalized profile, which includes interactive elements like mood charts, timelines, and visual graphs. This profile allows users to reflect on their emotional journey through music over time, identifying trends, such as times when they feel more relaxed or energized, or observing preferences for certain genres. Users can filter their emotional data by genre, song, or artist, helping them gain insight into how music impacts their mood.

Data Analytics and Reporting for Researchers and Administrators

Administrators and researchers can use the collected data to perform real-time emotional trend analysis. A comprehensive analytics dashboard enables detailed reporting, with options to view data filtered by demographics, music genre, emotion tags, and more. Advanced tools like heat maps and emotion-based charts provide a visual representation of emotional trends, helping to identify patterns and refine music selection or emotional tags. The data insights derived from this analysis could prove valuable in music therapy, entertainment, advertising, and other fields interested in the psychological impact of music.

Personalized Music Recommendation Engine

Based on users' emotional responses, the app will offer tailored music recommendations. For instance, if a user frequently tags songs as "Energetic" or "Happy," the system will recommend more upbeat tracks that align with these emotions. Conversely, if the user selects "Calm" or "Reflective," they will receive suggestions for more introspective or soothing music. The recommendation engine adapts over time, becoming increasingly personalized as it learns from users' ongoing feedback.
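
As a rough illustration of this adaptive behaviour, the sketch below recommends unseen tracks that match the emotions a user tags most often; it assumes a Python backend and a precomputed emotion-to-track index, both of which are hypothetical rather than part of this specification:

```python
from collections import Counter
from typing import Dict, List, Tuple

def recommend_tracks(
    user_tag_history: List[Tuple[str, str]],   # (track_id, emotion) pairs submitted by the user
    emotion_index: Dict[str, List[str]],        # emotion -> tracks predominantly tagged with it
    limit: int = 5,
) -> List[str]:
    """Suggest tracks matching the emotions the user tags most frequently."""
    already_heard = {track_id for track_id, _ in user_tag_history}
    emotion_counts = Counter(emotion for _, emotion in user_tag_history)

    recommendations: List[str] = []
    # Walk the user's emotions from most to least frequent and pull unseen tracks.
    for emotion, _ in emotion_counts.most_common():
        for track_id in emotion_index.get(emotion, []):
            if track_id not in already_heard and track_id not in recommendations:
                recommendations.append(track_id)
                if len(recommendations) == limit:
                    return recommendations
    return recommendations

# A user who mostly tags songs as "Energetic" receives more upbeat tracks first.
history = [("t1", "Energetic"), ("t2", "Energetic"), ("t3", "Happy")]
index = {"Energetic": ["t4", "t5"], "Happy": ["t6"], "Calm": ["t7"]}
print(recommend_tracks(history, index))  # ['t4', 't5', 't6']
```

As the engine learns from ongoing feedback, a simple frequency count like this could be supplemented by the machine learning techniques referenced in the glossary.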

Community and Social Features

To foster a sense of community, the app includes social features where users can connect with friends, share emotional responses, compare profiles, and join discussions on how music influences emotions. Features like shared playlists or collaborative surveys, where users create and share music lists focused on specific emotions (e.g., "Relaxing Tunes" or "Energetic Beats"), add an interactive and social aspect to the platform, enhancing user engagement.

Privacy and Data Security

Given the sensitivity of emotional and personal data, robust security measures are implemented. The application will comply with data privacy regulations (e.g., GDPR), ensuring responsible data handling practices. Users have control over their data and may delete their accounts if desired.
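
As one possible reading of the "users have control over their data" requirement, the sketch below (Python, all names hypothetical) deletes a user's personal record while detaching their contributed tags from any identity; whether tags are anonymized or removed entirely is a design decision this document leaves open:

```python
from typing import Dict, List

def delete_account(user_id: str,
                   users: Dict[str, dict],
                   annotations: List[dict]) -> None:
    """Erase the user's personal data and detach their annotations from their identity."""
    # Remove personally identifiable profile data (GDPR right to erasure).
    users.pop(user_id, None)
    # Keep the emotional labels for the dataset, but strip the link back to the person.
    for record in annotations:
        if record.get("user_id") == user_id:
            record["user_id"] = None  # anonymized contribution

# In-memory stand-ins for the real data stores.
users = {"user-42": {"email": "listener@example.com"}}
annotations = [{"user_id": "user-42", "track_id": "t1", "emotion": "Happy"}]
delete_account("user-42", users, annotations)
print(users, annotations)  # {} [{'user_id': None, 'track_id': 't1', 'emotion': 'Happy'}]
```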

This document provides essential guidelines for developers, project managers, and stakeholders to ensure clarity on the project’s purpose, functions, and objectives. It aligns all phases of development to meet user needs and deliver an engaging, user-centered experience. Ultimately, this project aims to gather valuable emotional data and enhance users’ understanding of how music affects their emotions.


1.3 Glossary

| Term | Definition |
|------|------------|
| AI (Artificial Intelligence) | The simulation of human intelligence by machines, often used in data analysis, pattern recognition, and emotion prediction. AI enables EmoTagger to process and analyze emotional data to improve recommendations. |
| API (Application Programming Interface) | A set of protocols and tools allowing different software applications to communicate and share data, enabling EmoTagger to integrate with third-party services like music platforms. |
| Annotation | The process of labeling or tagging data (in this case, music) with relevant information, such as emotions, to make it useful for machine learning and analytics. |
| Backend | The server-side part of EmoTagger, where data processing, storage, and retrieval are managed. This supports functionalities such as data storage, emotion analysis, and user management. |
| Crowdsourcing | The process of gathering input from a large group of people online, used in EmoTagger to collect emotional labels from a diverse user base, enhancing the emotional dataset and improving machine learning models through varied perspectives. |
| Community Engagement | A feature allowing users to interact, share, and discuss their emotional responses to music within EmoTagger, creating a social and interactive experience. |
| Classification | In the context of EmoTagger, the grouping of music based on emotions, moods, or other factors for easier recommendation and analysis. |
| Data Annotation | Specifically for audio and music, this involves categorizing and tagging audio files based on predefined emotional criteria, which becomes valuable training material for AI and ML applications. |
| Data Privacy | Measures ensuring that user information and emotional responses are protected from unauthorized access, adhering to legal and ethical standards. |
| Data Repository | A centralized storage within EmoTagger that contains emotional responses and related data for analysis and visualization. |
| DBMS (Database Management System) | Software used to create, manage, and interact with databases, where EmoTagger’s user and survey data is stored and managed. |
| Emotion Annotation | Tagging songs with specific emotions, such as happiness or sadness, based on user feedback. This helps in understanding and analyzing the emotional effects of music. |
| Emotion Spectrum | A range of emotions (e.g., joy, calmness, nostalgia) that users can select to describe their reactions to a song, allowing EmoTagger to capture complex emotional responses. |
| Encryption | The process of encoding data to protect it during transmission or storage, ensuring that only authorized users can access sensitive information within EmoTagger. |
| Frontend | The client-side or user-facing part of EmoTagger, where users interact with the application through interfaces such as forms, buttons, and visualizations. |
| FTP (File Transfer Protocol) | A network protocol used to transfer files between computers, which may be used in managing EmoTagger’s server files. |
| Gamification | Using game-like elements (e.g., points, badges, challenges) in EmoTagger to increase user engagement, motivating users to actively participate in the survey and data labeling. |
| Interface | The visual and interactive components of EmoTagger (e.g., buttons, menus, survey forms) that allow users to interact with the application. |
| JSON (JavaScript Object Notation) | A lightweight data format used for exchanging structured data between EmoTagger’s client and server, making it easier to send user feedback and emotional data. |
| Leaderboard | A ranking system within EmoTagger that displays top contributors, encouraging users to engage more actively by ranking based on contributions like surveys and annotations. |
| Machine Learning (ML) | A subset of AI that enables EmoTagger to learn from data and predict trends in emotional responses, supporting emotion analysis and improving recommendation systems. |
| Metadata | Data about data. For EmoTagger, metadata may include song details like title, artist, and genre, which help in organizing and analyzing user responses. |
| Primary Emotions | Basic emotions such as happiness, sadness, anger, or fear, which are universally recognized and easy to identify. |
| Profile | A personal section where users can manage their EmoTagger account, view past activity, and track engagement metrics. |
| Real-Time Data Processing | The ability of EmoTagger to analyze and process data as soon as it is inputted, allowing users to see immediate feedback or trends. |
| Registered Users | Individuals who create an account in EmoTagger, enabling them to participate in surveys, earn rewards, and access exclusive community features. |
| Scalability | The ability of EmoTagger to expand its system capacity to support a growing user base and data without affecting performance. |
| Secondary Emotions | More complex emotions, such as nostalgia or serenity, providing a nuanced view of user responses beyond basic emotions. |
| Sentiment Analysis | The process of analyzing text or audio to determine the emotional tone, potentially used in EmoTagger to analyze free-text comments for emotional content. |
| Survey | A set of questions provided to users to gather feedback on their emotional response to specific songs, creating a foundation for emotion annotations. |
| Tagging | The process of labeling music with emotional markers, helping EmoTagger to categorize songs based on emotional responses for analysis and recommendations. |
| Trend Graphs | Visual representations of patterns within EmoTagger that show how users’ emotional responses to songs change over time, allowing insights into popular emotional trends. |
| UI (User Interface) | The visual components of EmoTagger, such as buttons, forms, and graphics, that users interact with. |
| User Experience (UX) | The overall experience and satisfaction a user has while interacting with EmoTagger, impacted by design, usability, and system performance. |
| Web Application | The web-based version of EmoTagger accessible through browsers, designed to offer similar functionalities to the mobile application for desktop users. |
| XML (Extensible Markup Language) | A markup language used to structure data, often for data interchange between EmoTagger and other applications. |

1.4 Overview of the Document

EmoTagger is a comprehensive music survey application designed to explore and understand the emotional connections users have with music. By allowing users to listen to a variety of songs and tag their emotional responses, the app builds a rich and diverse dataset of these reactions. This dataset serves both personal and academic purposes, enabling detailed study of how different types of music affect individuals emotionally.

The project leverages crowdsourcing techniques to gather a wide array of emotional interpretations from a diverse group of users, enhancing the dataset's accuracy, cultural depth, and richness. Gamification elements are integrated into the platform to keep users engaged and motivated to contribute regularly, ensuring that the dataset remains robust and high-quality.

The resulting dataset supports advances in AI-driven emotional music analysis, with applications in areas such as personalized music recommendation systems, therapeutic music interventions, and cultural studies. By offering users a platform to reflect on, share, and analyze their emotional responses to music, EmoTagger bridges the gap between music and emotion, providing an engaging experience for anyone interested in exploring this relationship.

The app facilitates personal exploration and discovery while contributing to academic research, making it a valuable tool for individual users and researchers alike.

2. OVERALL DESCRIPTION

2.1 Product Perspective

EmoTagger is a community-driven emotion annotation tool: a platform designed to explore and analyze the emotional impact of music on listeners. Unlike traditional music rating platforms, EmoTagger goes beyond basic ratings by capturing the nuanced, multidimensional emotions that music evokes. By allowing users to annotate their emotional responses to music, the product enables an in-depth exploration of how individual experiences and preferences shape the emotional impact of songs.

EmoTagger serves as both a survey and analytical tool, featuring a responsive website and mobile applications to provide an accessible and intuitive user experience. These platforms enable a broad audience to participate in surveys, review aggregated data, and explore music through a range of emotional perspectives. Positioned within the broader landscape of psychological and music research, EmoTagger’s insights have the potential to advance understanding in studies on the psychological effects of music.

This tool fosters a community where users can engage with music on a deeper level, sharing emotional responses and musical experiences. By aggregating data on emotional reactions to various genres, artists, and songs, EmoTagger offers valuable insights for music enthusiasts, researchers, and industry stakeholders, thereby enhancing understanding of the complex relationship between music and emotion.

2.1.1. Development Methodology

The EmoTagger project follows an agile software development approach to meet user needs effectively and deliver high-quality, scalable emotional annotation results. Agile methodology enables iterative and incremental development with frequent user feedback, allowing flexibility in the development process. The project is developed in structured, repeating sprints, with each sprint focusing on specific functionalities such as user interface development, data collection modules, data analysis tools, and gamification features.

1. Requirements Analysis

In the initial phase of development, a comprehensive analysis of the target users’ needs was conducted. Given the platform’s goal of exploring the relationship between music and emotions, specific interactions within the app, types of data collected, and processing methods were identified in detail. Based on this analysis, the core functionalities and technical requirements of the application were defined.

2. Prototyping and Design

Following the requirements analysis, the design phase began with an emphasis on creating accessible, user-friendly interfaces for the web. Initial designs were developed as low-fidelity prototypes to quickly gather user feedback, and these prototypes were iteratively refined based on responses and usability testing.

3. Modular Development

EmoTagger’s architecture is modular, comprising independent components for data collection, data processing, gamification, data analysis, and reporting. This modular structure allows each component to be tested separately and updated as needed. In line with the agile methodology, each sprint focused on completing specific modules that were then integrated and tested.

4. Gamification and Crowdsourcing Integration

To enhance user engagement, the platform incorporates gamification elements (points, badges, levels). Using a crowdsourcing approach, the application aims to reach a broad, culturally diverse audience, encouraging participation from users of different backgrounds. For these features, various interaction and reward mechanisms were designed to sustain user motivation.

5. Testing and Feedback

After each development phase, extensive testing was conducted, and user feedback was collected. Functional, user acceptance, and performance tests were performed throughout the development to ensure platform quality. The application was continuously improved and debugged based on user feedback.

6. Deployment and Maintenance

Once development was completed, the EmoTagger platform was launched to a broad user base. To ensure continuous user participation and rapid implementation of updates, maintenance and development work remains ongoing. New features are added, and existing features are refined based on user feedback and analysis.

This methodology has resulted in the successful development of EmoTagger as a dynamic, user-driven platform. The agile approach provided a flexible, user-centered development process that enhanced product quality by focusing on responsiveness to user needs and iterative improvement.

2.2 User Characteristics

2.2.1. Participants

  • 2.2.1.1. Participants must have an interest in music and emotional analysis.
  • 2.2.1.2. Participants must have basic familiarity with using web applications.
  • 2.2.1.3. Participants must have an active email account.

2.2.2. Admin

  • 2.2.2.1. Admin must have experience or interest in data management and analysis.
  • 2.2.2.2. Admin must know how to use a computer proficiently.
  • 2.2.2.3. Admin must understand basic music categorization and emotional labeling principles.

3. REQUIREMENTS SPECIFICATION

3.1 External Interface Requirements

3.1.1 User Interfaces

The user interface will be accessible over the internet through the project website, using a standard web browser.

3.1.2 Hardware Interfaces

The hardware components required for the successful execution of the project and for the website to provide uninterrupted and fast service are as follows:

Server Hardware

  • Web Servers:

    • CPU: High-performance processors are required to process user requests quickly.
    • RAM: A large amount of RAM is important for database queries and application performance.
    • Storage: SSD storage (preferably NVMe SSD) will be used to store the song database and user tags, ensuring fast access.
    • Network Connectivity: High-bandwidth, low-latency network connections ensure users have a fast and uninterrupted experience.
  • Database Servers:

    • CPU: High-performance processors are required to process database queries quickly.
    • RAM: A sufficient amount of RAM ensures that database operations are performed quickly.
    • Storage: High-speed, large-capacity SSD storage units are used to store all songs and tags.
    • Network Connectivity: High-bandwidth network connections are required for fast data transfer between database servers.
  • Backup and Restore Systems (Backup Servers):

    • Storage: Large-capacity, reliable storage (preferably in RAID configurations) is used for backups.
    • Network Connectivity: High-speed network connections enable fast backup operations.
    • Backup Software: Software that can perform automatic backup and restore operations should be used.

User Hardware

  • Browser Compatibility: Users must use modern web browsers (Google Chrome, Mozilla Firefox, Safari, Microsoft Edge, etc.) to access the website.
  • Internet Connection: A fixed or mobile internet connection is required to access the website. A low latency and high-speed internet connection improves the user experience.
  • Audio Devices: Users need audio output devices such as speakers or headphones to listen to songs.

3.1.3 Software Interfaces

The application will require an internet browser for web access. No additional external software interfaces are required.

3.1.4 Communications Interfaces

The application will require internet connectivity for data synchronization, but no additional external communication interfaces are necessary.


3.2 Functional Requirements

3.2.1 User Registration and Profile Management Use Case

Use Case 1: User Registration and Profile Management

  • Use Case Name: User Registration and Profile Management
  • Use Case Number: UC01
  • Authors: Begüm
  • Actors: Participant, Admin
  • Overview: Allows users (both participants and admin) to register, log in, manage their profile, and log out of the platform.
  • Related Use Cases: UC02 (Emotional Tagging of Songs), UC07 (Access Privacy Settings)

Typical Flow Description:

  • Precondition: User must access the registration or login page.

    1. User provides required information (e.g., username, email) on the registration page.
    2. System validates information and creates a user profile if details are correct.
    3. User logs into the system using credentials.
    4. Admin logs in using elevated credentials to access admin functionalities.
    5. Both users can edit profile information, such as username and profile picture, from their profile management page.
    6. User logs out to end the session.
  • Post-condition: User’s profile is registered, managed, or updated as needed.

Alternative Flow Description:

  • Precondition: The system detects incomplete or invalid information during registration.

    1. System prompts user to re-enter missing or correct invalid information.
    2. User submits corrected information.
  • Post-condition: User registration is completed successfully, and the system creates a new profile.

Figure 1: User Registration and Profile Management
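
A minimal sketch of the validation behind step 2 of the typical flow and the alternative flow above, assuming a Python backend; the field names and rules shown are illustrative only:

```python
import re
from typing import Dict, List

EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_registration(form: Dict[str, str]) -> List[str]:
    """Return a list of problems; an empty list means the profile can be created."""
    errors: List[str] = []
    username = form.get("username", "").strip()
    email = form.get("email", "").strip()

    if not username:
        errors.append("Username is required.")
    if not email:
        errors.append("Email is required.")
    elif not EMAIL_PATTERN.match(email):
        errors.append("Email address is not valid.")
    return errors

# Typical flow: valid details -> no errors, so the system creates the profile.
print(validate_registration({"username": "listener42", "email": "listener@example.com"}))  # []
# Alternative flow: the system prompts the user to correct the reported problems.
print(validate_registration({"username": "", "email": "not-an-email"}))
```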


3.2.2 Emotional Tagging of Songs (Use Case 2)

  • Actors: Participant
  • Description: This use case allows participants to select a song and tag it with emotions they feel, contributing to a crowdsourced emotional dataset.
Steps:
  1. Song Selection: Participant browses the song list and selects a song.
  2. Tagging Emotions: Participant tags the song with one or more emotions (e.g., happiness, nostalgia).
  3. Saving Tags: System saves the emotional tags with the song's metadata.
  4. Updating Emotional Profile: System updates the emotional profile of the song based on user contributions.
  • Post-condition: The emotional profile of the song is updated with the aggregated tags from all participants.

  • Alternative Flow:

    • If the participant makes an error while tagging a song, they can undo a tag, and the system updates the emotional profile accordingly.

Figure 2: Emotional Tagging of Songs
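
A minimal sketch of steps 3 and 4 above (saving tags and updating the song's aggregated emotional profile), plus the undo described in the alternative flow; it assumes a Python backend, and the in-memory data structures are illustrative only:

```python
from collections import Counter
from typing import Dict, List

# song_id -> running count of each emotion tag (the song's "emotional profile")
emotional_profiles: Dict[str, Counter] = {}

def save_tags(song_id: str, emotions: List[str]) -> Counter:
    """Record one participant's tags and return the song's updated profile."""
    profile = emotional_profiles.setdefault(song_id, Counter())
    profile.update(emotions)
    return profile

def undo_tag(song_id: str, emotion: str) -> Counter:
    """Alternative flow: remove a tag added by mistake and update the profile accordingly."""
    profile = emotional_profiles.setdefault(song_id, Counter())
    if profile[emotion] > 0:
        profile[emotion] -= 1
    return profile

# A participant tags a song with two emotions, then undoes one of them.
print(save_tags("song-7", ["happiness", "nostalgia"]))  # Counter({'happiness': 1, 'nostalgia': 1})
print(undo_tag("song-7", "nostalgia"))                  # Counter({'happiness': 1, 'nostalgia': 0})
```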


3.2.3 View Emotional Trends (Use Case 3)

  • Actors: Participant
  • Overview: This use case enables a participant to view the aggregated emotional responses for a song, providing insight into how other users emotionally respond to the same music.
Steps:
  1. Song Selection: Participant selects a song they are interested in.
  2. Retrieve Emotional Data: The system retrieves aggregated emotional data for the selected song.
  3. Display Emotional Trends: Emotional trends are displayed in visual formats like pie charts or bar graphs.
  4. Optional Filtering: Participant can filter data by criteria like demographics or time range.
  • Post-condition: The participant views the emotional trends of the song.

  • Alternative Flow:

    • If there aren't enough emotional tags, the system notifies the participant to add a tag or try another song.

Figure 3: View Emotional Trends
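
A minimal sketch of how retrieved tags could be turned into the percentages shown in the charts, including the "not enough tags" alternative flow; the minimum-tag threshold and field names are assumptions, not values taken from this document:

```python
from collections import Counter
from typing import Dict, List, Optional

MIN_TAGS_FOR_TRENDS = 5  # illustrative threshold, not specified in this document

def emotional_trends(tags: List[dict],
                     demographic: Optional[str] = None) -> Dict[str, float]:
    """Return each emotion's share of the tags for a song, optionally filtered by demographic."""
    if demographic is not None:
        tags = [t for t in tags if t.get("demographic") == demographic]

    if len(tags) < MIN_TAGS_FOR_TRENDS:
        # Alternative flow: too little data, so the caller asks the participant
        # to add a tag or try another song.
        raise ValueError("Not enough emotional tags to display trends.")

    counts = Counter(t["emotion"] for t in tags)
    total = sum(counts.values())
    return {emotion: count / total for emotion, count in counts.items()}

# Six tags yield a distribution suitable for a pie chart or bar graph.
sample = ([{"emotion": "Happy", "demographic": "18-25"}] * 4
          + [{"emotion": "Relaxed", "demographic": "26-35"}] * 2)
print(emotional_trends(sample))  # {'Happy': 0.666..., 'Relaxed': 0.333...}
```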


3.2.4 Crowdsourced Data Collection (Use Case 4)

  • Actors: Participant, Admin
  • Overview: This use case describes how participants contribute emotional tags for songs, which helps to build a comprehensive dataset. Admins ensure the data's quality and diversity.
Steps:
  1. Song Selection: Participant listens to a song and selects it for tagging.
  2. Tagging Emotions: Participant tags the song with predefined emotional labels.
  3. Recording Tags: The system records the emotional tags and updates the dataset.
  4. Admin Review: Admin monitors the volume and diversity of the dataset to ensure completeness.
  • Post-condition: The emotional dataset is updated with new contributions.

  • Alternative Flow:

    • If the participant faces issues while tagging, the system prompts them to correct the tags.

Figure 4: Crowdsourcing Emotional Tags for Songs


3.2.5 Gamification for User Engagement (Use Case 5)

  • Actors: Participant
  • Overview: This use case introduces gamification elements (e.g., points, badges, leaderboards) to encourage user participation.
Steps:
  1. Activity Participation: The participant completes activities like tagging songs or participating in challenges.
  2. Awarding Points: The system awards points for completed activities.
  3. Unlocking Badges/Levels: When milestones are met, the user unlocks badges or levels.
  4. Leaderboard Updates: The participant’s points are added to the leaderboard.
  5. Viewing Achievements: The participant can view their points, badges, and rank.
  • Post-condition: User points, badges, and leaderboard positions are updated.

  • Alternative Flow:

    • If the participant does not complete the activity, no points are awarded, but they can still view their existing progress.

Figure 5: Gamification for User Engagement
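
A minimal sketch of the points, badge, and leaderboard bookkeeping in the steps above; the point values and badge thresholds are illustrative assumptions, not requirements:

```python
from typing import Dict, List, Tuple

POINTS_PER_ACTIVITY = {"tag_song": 10, "complete_challenge": 50}  # assumed values
BADGE_THRESHOLDS = [(100, "Bronze Listener"), (500, "Silver Listener"), (1000, "Gold Listener")]

points: Dict[str, int] = {}
badges: Dict[str, List[str]] = {}

def record_activity(user_id: str, activity: str) -> None:
    """Award points for a completed activity and unlock any newly reached badges."""
    points[user_id] = points.get(user_id, 0) + POINTS_PER_ACTIVITY.get(activity, 0)
    earned = badges.setdefault(user_id, [])
    for threshold, badge in BADGE_THRESHOLDS:
        if points[user_id] >= threshold and badge not in earned:
            earned.append(badge)

def leaderboard(top_n: int = 10) -> List[Tuple[str, int]]:
    """Return the highest-scoring participants for display on the leaderboard."""
    return sorted(points.items(), key=lambda item: item[1], reverse=True)[:top_n]

# A participant tags ten songs and unlocks the first badge.
for _ in range(10):
    record_activity("user-42", "tag_song")
print(points["user-42"], badges["user-42"])  # 100 ['Bronze Listener']
print(leaderboard())                          # [('user-42', 100)]
```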


3.2.6 Settings and Preferences Management (Use Case 6)

  • Actors: Participant, Admin
  • Overview: This use case allows users to modify settings such as theme selection, notification preferences, and volume control.
Steps:
  1. Accessing Settings: The user opens the settings menu.
  2. Adjusting Preferences: User modifies various settings options.
  3. Saving Settings: The system saves the updated preferences for future sessions.
  • Post-condition: The user’s settings are saved and applied in future sessions.

  • Alternative Flow:

    • If the user doesn't save changes, the settings are discarded, and previous settings remain.

Figure 6: Settings and Preferences Management


3.2.7 Access Privacy Settings (Use Case 7)

  • Actors: Participant, Admin
  • Overview: This use case allows users to manage their privacy settings and control data sharing preferences.
Steps:
  1. Access Privacy Settings: The user navigates to the "Privacy Settings" section.
  2. Modify Data Sharing Preferences: The user enables or disables data sharing options.
  3. Update Preferences: The system validates and saves the new privacy settings.
  • Post-condition: The user's privacy settings are saved and enforced during future interactions.

Figure 7: Access Privacy Settings


3.2.8 Song and Emotion Data Analysis (Use Case 8)

  • Actors: Admin
  • Overview: This use case allows an admin to analyze aggregated emotional and song data through a dashboard.
Steps:
  1. Access Data Analysis Dashboard: Admin navigates to the dashboard.
  2. Retrieve Emotional Data: The system displays aggregated data.
  3. Apply Filters: Admin filters data based on criteria like genre or demographics.
  4. Generate Reports: The system displays data visualizations (charts/graphs) to support decision-making.
  • Post-condition: The admin gains insights into user behavior and emotional trends.

  • Alternative Flow:

    • If the system fails to retrieve data, the admin is notified and can retry.

Figure 8: Song and Emotion Data Analysis
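
A minimal sketch of the filter-and-report step of this dashboard, assuming each recorded tag carries genre and demographic metadata; all field names here are illustrative:

```python
from collections import Counter
from typing import Dict, List, Optional

def emotion_report(tags: List[dict],
                   genre: Optional[str] = None,
                   demographic: Optional[str] = None) -> Dict[str, int]:
    """Count emotion tags matching the admin's filters, ready to feed a chart or graph."""
    selected = [
        t for t in tags
        if (genre is None or t.get("genre") == genre)
        and (demographic is None or t.get("demographic") == demographic)
    ]
    return dict(Counter(t["emotion"] for t in selected))

# Emotions reported for rock tracks by listeners aged 18-25.
dataset = [
    {"emotion": "Energetic", "genre": "rock", "demographic": "18-25"},
    {"emotion": "Happy", "genre": "pop", "demographic": "18-25"},
    {"emotion": "Energetic", "genre": "rock", "demographic": "26-35"},
]
print(emotion_report(dataset, genre="rock", demographic="18-25"))  # {'Energetic': 1}
```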


3.3 Performance Requirements

  • The system should handle a large music database with high traffic volume, ensuring low-latency queries and fast, accurate tagging.
  • Data processing and retrieval should be optimized to maintain a smooth user experience.

3.4 Software System Attributes

  • Portability: The web application can be accessed via standard browsers.
  • Performance: Data transactions should be quick to provide timely feedback.
  • Usability: Data should be easily retrievable with clear error messages for users.
  • Adaptability: The system should accommodate data structure changes without significant updates.
  • Scalability: The system should scale to handle increasing data volume and users.

3.5 Safety Requirements

  • Data security best practices, including encryption and role-based access control, should be implemented to protect user information.
  • Backup and recovery mechanisms should be in place to safeguard against data loss.

References

  1. Agile Software Development Manifesto [Link](https://agilemanifesto.org/)
  2. Gamification Definition [Link](https://www.mindtrek.org/)
  3. Crowdsourcing [Link](https://wired.com)
  4. Emotion Recognition, IEEE Signal Processing
  5. GDPR [Link](https://gdpr-info.eu/)
