
CENG407-408 Project Report

Introduction

The EmoTagger system is an innovative approach designed to investigate the deep connection between music and human emotions. By capturing the varied emotional responses that music evokes, the platform serves as a community-based emotion tagging tool that uses crowdsourcing to build a broad dataset of emotion labels for over 100 songs. Given the complexity of human emotional expression and the differences in how individuals interpret music, EmoTagger aims to become a valuable tool for understanding how music affects emotional well-being across individuals and cultures. In addition, an AI component is planned to act as a "third eye" that can infer a track's genre and assign emotion tags after listening to it, helping the project gather data at lower cost.

The system uses advanced data collection techniques and AI-based analysis to classify songs according to emotional responses such as "Joyful," "Melancholic," and "Vibrant." EmoTagger allows users to tag music tracks, explore emotional patterns, and interact with customized music suggestions via a web platform. Leaderboards encourage users to stay engaged and stimulate community participation. This approach allows the system to generate high-quality data while keeping users focused. The foundation of EmoTagger is established in a detailed Software Requirements Specification (SRS) and Software Design Document (SDD); these documents serve as an important resource for developers and stakeholders, providing clarity and uniformity throughout the project lifecycle.

EmoTagger offers direct benefits for music enthusiasts and scholars, as well as for music therapy, psychological research, and AI-powered music recommendation systems; if the AI "third eye" is later generalized into a framework, it could also be applied to tagging music or other kinds of metadata. By providing deeper insight into the emotional effects of music, the system sets a new standard for innovation at the intersection of technology, music, and emotional discovery.


Abstract

Music, serving as a universal means of communication, has significantly impacted human history by profoundly connecting with people's emotional experiences. As technology progresses, digital music platforms have changed how individuals engage with music, providing a chance to discover the emotional bonds they form with tracks. Currently, methods for examining music's effects on people's emotions have become significant, particularly in studies focused on enhancing the understanding of listener psychology and preferences. Our project involves the development of a music survey application that gathers feedback from users about the songs they listen to, aiming to capture the emotional effects these songs evoke. The survey results reveal that users respond to music not only with primary emotions like happiness, sadness, or energy, but with a broader spectrum of emotions shaped by personal experiences. This highlights the multidimensional impact of music on individuals and underscores the need to analyze diverse emotional responses among users. The next phase of the project involves developing a website to make these survey-based analyses accessible to a wider audience and to enrich user experience. Through these platforms, individuals will be able to experience music from various emotional perspectives and share their musical experiences within a community. This study aims to deepen the understanding of the connection between musical preferences and emotional responses, offering new insights into the psychological impact of music based on extensive data.


Literature Review

1. Crowdsourcing and Data Annotation in Music Labeling

Crowdsourcing is an innovative method for harnessing the collective power of diverse groups worldwide through internet platforms. It enables people to effectively contribute data, opinions, and information by decomposing difficult tasks into smaller, more manageable components. "Crowdsourcing involves seeking knowledge, goods, or services from a large body of people."[1] This approach produces richer, more reliable data and is consistent with the "wisdom of the crowd" theory. Crowdsourcing's cost-effectiveness, speed, and diversity make it increasingly valuable for large-scale data collection, giving organizations affordable access to a wide range of expertise. Most crowdsourcing projects revolve around four core concepts: gaining wisdom or knowledge through problem solving; creating new content, such as through wikis; voting on policies or people; and crowdfunding, which involves raising money for various purposes, usually by collecting small amounts from a large pool of people.[2]

By allowing individuals from various cultural and personal backgrounds to tag music according to their emotional perceptions, crowdsourcing is crucial to our project's goal of creating a rich and varied emotional dataset. "By improving the dataset, learning applications can become more effective and personalized. Using a diverse emotional dataset improves the accuracy and efficiency of machine learning and AI applications. This allows us to better analyze different emotional interpretations."[3] We hope to achieve a wide range of emotional interpretations by requesting contributions from a diverse user base, reflecting the ways in which different audiences perceive music. Crowdsourced data labeling harnesses the power of the crowd to expedite the process: crowdsourcing platforms engage a large-scale, diverse pool of human annotators from around the world, each contributing their skills and knowledge.[4] The dataset will be enhanced by this inclusivity, increasing its value for machine learning applications that need a variety of emotional data. A greater diversity of labelers can also offset the potential for unconscious biases of any particular group to influence the dataset.[5]

2. How Crowdsourcing Works for Music Labeling

In music, data tagging is the process that allows machine learning models to learn musical features. Different types of tagging, such as note recognition, chord detection, tempo analysis, and even genre classification, enable deeper analysis of music data.[6] Crowdsourced tagging often uses methods such as voting or surveys, allowing users to efficiently categorize content. As part of our project’s crowdsourcing implementation, users will be asked to tag each track with the emotion they believe best expresses it. Like community-based platforms such as Last.fm, this tagging method draws on personal input from a vast audience to create a strong dataset that reflects nuanced differences while showcasing shared feelings. The quality of data is a crucial element that directly influences tagging outcomes; improper or insufficient tagging can adversely affect the effectiveness of machine learning models.
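
As an illustration of the voting approach described above, the following minimal Python sketch shows how per-user emotion tags for one track could be aggregated into a consensus label by majority vote. The function and threshold are illustrative assumptions, not part of the actual EmoTagger codebase.

```python
from collections import Counter

def aggregate_emotion_votes(tags: list[str], min_votes: int = 3) -> str | None:
    """Aggregate crowdsourced emotion tags for one track by majority vote.

    Returns the most frequent emotion, or None when there are too few
    votes to form a reliable consensus.
    """
    if len(tags) < min_votes:
        return None  # not enough contributions yet
    counts = Counter(tags)
    emotion, _ = counts.most_common(1)[0]
    return emotion

# Example: votes gathered from five different listeners for one song
votes = ["Happy", "Energetic", "Happy", "Happy", "Calm"]
print(aggregate_emotion_votes(votes))  # -> "Happy"
```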

3. Data Annotation in Music

Data annotation, specifically for audio and music, involves categorizing and tagging audio files manually based on predefined criteria. Users who assign emotions to musical tracks will provide crowdsourced input for our project's emotional labeling. This annotated data then becomes training material for AI systems, helping them recognize and classify emotions in music more accurately. This method bridges the gap between objective machine learning and subjective human experiences by utilizing crowdsourcing for AI development.
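
To make the idea of annotated audio as training material concrete, here is a small, hypothetical sketch of how an emotion annotation could be represented before being handed to a machine-learning pipeline. The field names are illustrative assumptions, not the project's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class EmotionAnnotation:
    """One crowdsourced emotion label attached to a music track."""
    track_id: str                 # identifier of the annotated song
    user_id: str                  # anonymized contributor identifier
    emotion: str                  # e.g. "Happy", "Melancholic", "Vibrant"
    comment: str | None = None    # optional free-text feedback
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def to_training_example(a: EmotionAnnotation) -> tuple[str, str]:
    """Reduce an annotation to a (track, label) pair for supervised learning."""
    return (a.track_id, a.emotion)
```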

4. Music and Emotion Analysis

Recent studies have highlighted that music evokes a range of emotional states (e.g., joy, sadness, anxiety), and cultural factors influence how listeners from different backgrounds perceive these emotions. For instance, both Chinese and U.S. listeners may recognize a piece as joyful or angry, though cultural nuances affect whether they view these emotions favorably or unfavorably. Building on such findings, applications can analyze user preferences and emotional responses to music, tailoring playlists or recommending new songs that align with the user's current emotional state.[7] Researchers can quickly and cheaply gather this subjective data using crowdsourced platforms like Amazon Mechanical Turk[8], which is a great resource for creating algorithms that can suggest music based on users' moods. As this field continues to evolve, it promises to bridge the gap between human emotions and technological advancements, leading to more intuitive and empathetic software solutions.[8]

5. Practical Applications and Benefits of Crowdsourcing for Our Project

Crowdsourcing is essential to our project's creation of an extensive, culturally diverse dataset for the classification of emotional music. This variation in emotional tagging supports the development of recommendation systems on music platforms, facilitating customized, mood-oriented music suggestions. The importance of crowdsourcing in user-driven data gathering is further illustrated by its support of psychological applications, including the therapeutic use of music to evoke or manage emotions. By leveraging the capabilities of crowdsourcing, we can collect an extensive amount of data from people with varied cultural backgrounds and musical tastes. This varied dataset will enable us to recognize nuanced emotional reactions to music, going beyond broad classifications such as "happy" or "sad." We can examine particular emotions such as nostalgia, anger, or calmness, offering a deeper comprehension of music's emotional influence. Additionally, crowdsourcing can help us reveal cultural variations in music perception: different cultures might link specific musical elements to distinct emotions. Examining the feedback of individuals from diverse cultural backgrounds allows us to understand these cultural nuances and create more precise and culturally sensitive music recommendation systems.

6. Gamification for Enhanced Engagement

To keep users motivated, our platform incorporates gamification elements such as points, badges, levels, challenges, and leaderboards. These features encourage users to contribute more labels, fostering a sense of achievement and community participation. This combination of crowdsourcing and gamification aims to maximize label accuracy, user engagement, and dataset volume. By providing immediate feedback and recognition, gamification elements incentivize users to actively participate and contribute to the project. This creates a positive cycle where users are motivated to label more music, leading to a higher-quality dataset and a more engaging user experience.

7. Conclusion

Crowdsourcing in music annotation allows us to efficiently gather emotional labels from a diverse audience, yielding a data-rich foundation for training AI systems to recognize and categorize music by mood. The gamified approach incentivizes contributions, enhancing both the volume and quality of data collected. Ultimately, our project aims to advance the field of emotional music analysis, benefiting AI research, music recommendation algorithms, and therapeutic music applications.

Software Requirements Specification

Introduction

1.1 Purpose

This document aims to outline the requirements for creating a web application that enables users to tag music files according to their emotional effect. This project employs crowdsourcing to collect a wide variety of interpretations regarding the emotional aspects of music, resulting in an extensive dataset that encapsulates how various listeners emotionally react to identical music. Via this platform, we intend to gather a dataset of 10,000 annotated music files that reflect a broad range of emotional responses from users.

To foster strong user engagement, the app will integrate gamification features such as points, badges, and tiers. These characteristics enhance the labeling process, making it more fun and fulfilling, which motivates users to engage actively. Users can monitor their progress, participate in challenges, and see their position on leaderboards, adding a competitive and enjoyable aspect to the experience. This method seeks to promote ongoing involvement, aiding the project's objective of collecting a comprehensive and precise collection of emotion-driven music labels.

1.2 Scope of the Project

The primary feature of this application is to offer a space where users can enjoy selected music tracks and share their feelings by choosing from various predefined emotion categories. This procedure seeks to gather varied emotional responses to develop a thorough dataset, beneficial for analyzing how various music genres impact listeners' emotions. The platform attracts a diverse audience, including casual listeners looking for self-reflection, music fans, researchers, and individuals curious about how music affects emotional health.

The application offers a structured yet user-friendly method for recording emotional data by letting users choose from various emotion tags like "Happy," "Sad," "Energetic," or "Relaxed." This dataset acts as the basis for forthcoming analysis and research in areas such as music therapy, psychology, and data science, providing insights into trends such as typical emotional reactions to specific genres, tempos, or lyrical themes. It aims to enable detailed investigation into how demographic factors and individual backgrounds can influence emotional reactions to different aspects of music.

Key Features:

  • Emotion Labeling and Survey Feedback Collection: Users listen to each music file and choose an emotion that aligns with their experience. Additionally, they have the option to provide more detailed feedback, describing how the song impacted them personally. This data captures not only immediate emotional reactions but also in-depth, nuanced responses that contribute to a richer understanding of music’s emotional impact.

  • Gamification Elements: By integrating points, badges, levels, and leaderboards, the app encourages ongoing participation, making the labeling process more engaging and rewarding. These elements help foster a competitive and fun environment, where users are motivated to track their progress, participate in challenges, and view their standings on leaderboards. Gamification not only boosts engagement but also aids in collecting a larger, more accurate dataset.

  • Interactive User Profiles and Emotion Tagging: Each user’s emotional responses and survey results are saved in a personalized profile, which includes interactive elements like mood charts, timelines, and visual graphs. This profile allows users to reflect on their emotional journey through music over time, identifying trends, such as times when they feel more relaxed or energized, or observing preferences for certain genres. Users can filter their emotional data by genre, song, or artist, helping them gain insight into how music impacts their mood.

  • Data Analytics and Reporting for Researchers and Administrators: Administrators and researchers can use the collected data to perform real-time emotional trend analysis. A comprehensive analytics dashboard enables detailed reporting, with options to view data filtered by demographics, music genre, emotion tags, and more. Advanced tools like heat maps and emotion-based charts provide a visual representation of emotional trends, helping to identify patterns and refine music selection or emotional tags. The data insights derived from this analysis could prove valuable in music therapy, entertainment, advertising, and other fields interested in the psychological impact of music.

  • Personalized Music Recommendation Engine: Based on users' emotional responses, the app will offer tailored music recommendations. For instance, if a user frequently tags songs as "Energetic" or "Happy," the system will recommend more upbeat tracks that align with these emotions. Conversely, if the user selects "Calm" or "Reflective," they will receive suggestions for more introspective or soothing music. The recommendation engine adapts over time, becoming increasingly personalized as it learns from users' ongoing feedback.

  • Community and Social Features: To foster a sense of community, the app includes social features where users can connect with friends, share emotional responses, and discuss how music affects their emotions. Features like shared playlists, where users create and share music lists focused on specific emotions (e.g., "Relaxing Tunes" or "Energetic Beats"), add an interactive and social aspect to the platform, enhancing user engagement.

  • Privacy and Data Security: Given the sensitivity of emotional and personal data, robust security measures are implemented. The application will comply with data privacy regulations (e.g., GDPR), ensuring responsible data handling practices. Users have control over their data and may delete their accounts if desired.

1.3 Glossary

| Term | Definition |
| --- | --- |
| AI (Artificial Intelligence) | The emulation of human intelligence by machines, used for data analysis, pattern recognition, and forecasting. |
| API (Application Programming Interface) | A set of protocols enabling EmoTagger to connect with external services like music platforms. |
| Annotation | The act of labeling music data with emotions to enhance machine learning and analysis. |
| Backend | The server-side component managing data processing, storage, and retrieval. |
| Crowdsourcing | Gathering input from a large group of users to enrich emotional datasets and refine ML models. |
| Community Interaction | Features enabling users to share, discuss, and engage with emotional reactions to music. |
| Classification | Grouping music based on emotions, moods, or other attributes for better recommendations and analysis. |
| Data Annotation | Categorizing audio files based on emotions, serving as AI and ML training data. |
| Data Privacy | Measures ensuring protection of user information and compliance with legal and ethical guidelines. |
| Data Repository | Central storage for emotional responses and associated data, enabling analysis and visualization. |
| DBMS (Database Management System) | Software for creating, managing, and interacting with EmoTagger databases. |
| Emotion Annotation | Labeling songs with specific emotions like joy, sadness, or calmness. |
| Web Application | Browser-based interface providing EmoTagger features for desktop users. |
| XML (Extensible Markup Language) | A markup language used for organizing and exchanging data between EmoTagger and other applications. |

1.4 Overview of the Document

The EmoTagger initiative is a creative and all-encompassing music polling app aimed at investigating and grasping the emotional bonds users share with music. By enabling users to hear a range of songs and label their emotional reactions, the app creates a comprehensive and varied dataset of these responses. This important dataset has various uses, both personal and scholarly, allowing for in-depth research and examination of the emotional effects that different genres of music have on people. The initiative utilizes crowdsourcing methods to collect a broad range of emotional interpretations from a diverse and varied population of users, thus improving the accuracy, cultural depth, and richness of the dataset.

Gamification features are carefully woven into the platform to maintain user engagement, motivation, and a steady encouragement to contribute, guaranteeing that the dataset stays robust, thorough, and of high quality. The resulting large dataset enables major progress in AI-based emotional music analysis, which can be utilized in multiple domains like tailored music recommendation systems, therapeutic music applications, and cultural research. EmoTagger connects music and emotions by providing users a platform to contemplate, share, and assess their feelings towards music, delivering a deep, captivating, and immersive experience for those eager to delve into this complex and varied relationship. The app aids not only in personal exploration and discovery but also enhances academic research and understanding, making it a valuable resource for individual users and researchers too.

Overall Description

2.1 Product Perspective

The EmoTagger, a community-focused Emotion Annotation Tool, is a distinctive platform created to investigate and assess the emotional effects of music on its audience. In contrast to conventional music rating platforms, EmoTagger goes beyond simple ratings by capturing the complex and varied emotions that music elicits. By enabling users to note their emotional reactions to music, the product promotes a thorough investigation of how personal experiences and tastes influence the emotional effects of songs. EmoTagger functions as both a survey and analytical instrument, offering a responsive website and mobile apps to ensure an accessible and user-friendly experience. These platforms allow a wide audience to take part in surveys, examine aggregated information, and discover music from various emotional viewpoints. Situated within the wider context of psychological and music research, EmoTagger's findings could enhance comprehension in investigations of music's psychological impacts. This platform cultivates a community where users can connect with music more profoundly, exchanging emotional reactions and musical experiences. By compiling information on emotional responses to different genres, artists, and tracks, EmoTagger provides important insights for music lovers, scholars, and industry professionals, thus deepening comprehension of the intricate connection between music and emotion.


2.1.1. Development Methodology

The EmoTagger initiative adopts an agile software development methodology to effectively address user requirements and provide high-quality, scalable emotional annotation outcomes. Agile methodology facilitates incremental and iterative development while incorporating regular user feedback, which permits adaptability in the development process. The project is executed in organized, iterative sprints, with each sprint concentrating on particular functionalities like user interface development, data collection components, data analysis tools, and gamification aspects.

1. Requirements Analysis

In the initial phase of development, a comprehensive analysis of the target users’ needs was conducted. Given the platform’s goal of exploring the relationship between music and emotions, specific interactions within the app, types of data collected, and processing methods were identified in detail. Based on this analysis, the core functionalities and technical requirements of the application were defined.

2. Prototyping and Design

Following the requirements analysis, the design phase began with an emphasis on creating accessible, user-friendly interfaces for the web. Initial designs were developed as low-fidelity prototypes to quickly gather user feedback, and these prototypes were iteratively refined based on responses and usability testing.

3. Modular Development

EmoTagger features a modular architecture, consisting of separate components for gathering data, processing data, gamifying, analyzing data, and generating reports. This modular design enables every part to be tested individually and revised when necessary. In accordance with the agile methodology, every sprint aimed to finish particular modules that were subsequently integrated and tested.

4. Gamification and Crowdsourcing Integration

To boost user engagement, the platform integrates gamification features (points, badges, levels). Employing a crowdsourcing method, the application seeks to connect with a wide, culturally varied audience, prompting involvement from users of diverse backgrounds. To maintain user motivation, different interaction and reward mechanisms were created for these features.

5. Testing and Feedback

Following each development stage, thorough testing took place and user feedback was gathered. Functional, user acceptance, and performance tests were conducted during development to guarantee the quality of the platform. The app was continuously improved and debugged according to user input.

6. Deployment and Maintenance

After the development was finished, the EmoTagger platform was introduced to a wide audience. To guarantee ongoing user engagement and prompt deployment of updates, development and maintenance activities are continuously in progress. New functionalities are introduced, and current features are enhanced based on user input and evaluation.

This approach has led to the effective creation of EmoTagger as an interactive, user-oriented platform. The agile methodology offered a flexible, user-focused development process that improved product quality by emphasizing responsiveness to user demands and continuous enhancement.


2.2 User Characteristics

2.2.1. Participants

2.2.1.1. Participants must have an interest in music and emotional analysis.

2.2.1.2. Participants must have basic familiarity with using web applications.

2.2.1.3. Participants must have an active email account.

2.2.2. Admin

2.2.2.1. Admin must have experience or interest in data management and analysis.

2.2.2.2. Admin must know how to use a computer proficiently.

2.2.2.3. Admin must understand basic music categorization and emotional labeling principles.


3. REQUIREMENTS SPECIFICATION

3.1 External Interface Requirements

3.1.1 User Interfaces

The user interface will be accessible over the internet through the project website.

3.1.2 Hardware Interfaces

The hardware components required for the successful execution of the project and for the website to provide uninterrupted and fast service are as follows:

Server Hardware

Web Servers:

  • CPU: High performance processors are required to process user requests quickly.
  • RAM: A large amount of RAM is important for database queries and application performance.
  • Storage: SSD storage (preferably NVMe SSD) for fast access will be used for storing the song database and user tags.
  • Network Connectivity: High bandwidth and low latency network connections ensure users have a fast and uninterrupted experience.

Database Servers:

  • CPU: Good performance processors are required to process database queries quickly.
  • RAM: A sufficient amount of RAM ensures that database operations are performed quickly.
  • Storage: Rapid and spacious SSD storage devices are utilized to hold all songs and tags.
  • Network Connectivity: Fast data transfer between database servers necessitates high bandwidth network connections.

Backup and Recovery Systems

Backup Servers:

  • Storage: Reliable and spacious storage (ideally in RAID setups) is utilized for backups.
  • Network Connectivity: Rapid network connections facilitate quick backup processes.
  • Backup Software: Software capable of executing automatic backup and restoration tasks should be utilized.

User Hardware

  • Browser Compatibility: Users need to utilize current web browsers (such as Google Chrome, Mozilla Firefox, Safari, Microsoft Edge, etc.) to visit the website.
  • Internet Access: A stable or portable internet connection is necessary to reach the website. A quick and low latency internet connection enhances the user experience.
  • Audio Equipment: Individuals require audio output devices like speakers or headphones to hear music.

3.1.3 Software Interfaces

The application will require an internet browser for web access. No additional external software interfaces are required.

3.1.4 Communications Interfaces

The application will require internet connectivity for data synchronization, but no additional external communication interfaces are necessary.

3.2 Functional Requirements

3.2.1 User Registration and Profile Management Use Case

Use Case Name: User Registration and Profile Management
Use Case Number: UC01
Authors: Begüm
Actors: Participant, Admin
Overview: Allows users (both participants and admin) to register, log in, manage their profile, and log out of the platform.
Related Use Cases: UC02 (Emotional Tagging of Songs), UC07 (Access Privacy Settings)

Typical Flow Description:

Precondition: User must access the registration or login page.

  1. User provides required information (e.g., username, email) on the registration page.
  2. System validates information and creates a user profile if details are correct.
  3. User logs into the system using credentials.
  4. Admin logs in using elevated credentials to access admin functionalities.
  5. Both users can edit profile information, such as username and profile picture, from their profile management page.
  6. User logs out to end the session.

Post-condition: User’s profile is registered, managed, or updated as needed.

Alternative Flow Description:

Precondition: The system detects incomplete or invalid information during registration.

  1. System prompts user to re-enter missing or correct invalid information.
  2. User submits corrected information.

Post-condition: User registration is completed successfully, and the system creates a new profile.

Figure 1: User Registration and Profile Management
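
The typical and alternative flows above can be illustrated with a small, hypothetical validation routine; the function name and validation rules are illustrative assumptions, not the project's actual implementation.

```python
import re

def validate_registration(username: str, email: str) -> list[str]:
    """Return a list of problems; an empty list means the input is valid."""
    problems = []
    if not username or len(username) < 3:
        problems.append("Username must be at least 3 characters long.")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        problems.append("Email address is not valid.")
    return problems

# Typical flow: valid input -> the profile can be created
# Alternative flow: problems are returned so the user can re-enter them
print(validate_registration("begum", "begum@example.com"))  # -> []
print(validate_registration("", "not-an-email"))            # -> two messages
```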


3.2.2 Emotional Tagging of Songs Use Case

Use Case Name: Emotional Tagging of Songs
Use Case Number: UC02
Authors: Selin
Actors: Participant
Overview: Allows participants to select a song from a list, tag it with emotions they experience, and contribute to a crowdsourced emotional dataset. The system saves these emotional tags, updates the song’s emotional profile, and aggregates the data based on user contributions.
Related Use Cases: UC01 (User Registration and Profile Management), UC05 (Gamification for User Engagement)

Typical Flow Description:

Precondition: The user must log in and access the emotional tagging system.

  1. Participant selects a song from the available list.
  2. Participant tags the song with emotions (e.g., happiness, nostalgia).
  3. System saves the emotional tags along with the song data.
  4. System updates the overall emotional profile of the song based on user contributions.

Post-condition: The song's emotional profile is updated and reflects the aggregated tags provided by all participants.

Alternative Flow Description:

Precondition: The participant makes an error while tagging a song.

  1. Participant decides to undo a tag.
  2. System removes or updates the tag.

Post-condition: The incorrect or unwanted tag is successfully removed or modified, and the system recalculates the song’s emotional profile.

Figure 2: Emotional Tagging of Songs
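
As a rough illustration of how the typical and alternative flows above could be realized, here is a minimal Python sketch using an in-memory store; the function names and data shapes are assumptions, not the actual EmoTagger backend.

```python
from collections import Counter, defaultdict

# song_id -> Counter of emotion tags contributed so far (in-memory stand-in)
emotion_profiles: dict[str, Counter] = defaultdict(Counter)

def tag_song(song_id: str, emotions: list[str]) -> dict[str, int]:
    """Steps 3-4 of the typical flow: save the participant's tags and return
    the song's updated (aggregated) emotional profile."""
    emotion_profiles[song_id].update(emotions)
    return dict(emotion_profiles[song_id])

def undo_tag(song_id: str, emotion: str) -> dict[str, int]:
    """Alternative flow: remove one previously submitted tag and recalculate."""
    counts = emotion_profiles[song_id]
    if counts[emotion] > 0:
        counts[emotion] -= 1
    return {e: c for e, c in counts.items() if c > 0}

print(tag_song("song-42", ["happiness", "nostalgia"]))  # {'happiness': 1, 'nostalgia': 1}
print(undo_tag("song-42", "nostalgia"))                 # {'happiness': 1}
```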


3.2.3 View Emotional Trends Use Case

Use Case Name: View Emotional Trends
Use Case Number: UC03
Authors: Serhat
Actors: Participant
Overview: Allows participants to view aggregated emotional responses for a selected song, providing insights into how other users have emotionally responded to the same music.
Related Use Case: UC02 (Emotional Tagging of Songs)

Typical Flow Description:

Precondition: The participant must be logged in to the platform.

  1. The participant navigates to the song library or search bar and selects a song.
  2. The system retrieves the aggregated emotional data for the selected song, showing the percentage of users who tagged the song with emotions like "happy," "calm," "energetic," etc.
  3. The system displays the emotional trends in a visual format (e.g., pie chart or bar graph).
  4. The participant can filter the data by criteria (e.g., demographics or time range).

Post-condition: The participant views the emotional trends and gains insight into the general emotional impact of the song among other users.

Alternative Flow Description:

Precondition: The system cannot retrieve the aggregated emotional data due to insufficient data points.

  1. The participant selects a song to view emotional trends.
  2. The system checks for the availability of sufficient emotional tags.
  3. The system notifies the participant that more emotional tags are needed to display trends.
  4. The participant is prompted to tag the song with their own emotional response or return to the previous screen.

Post-condition: If the participant chooses to add a tag, their response contributes to building the dataset for future trend analysis.

Figure 3: View Emotional Trends
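
The trend view described above could be computed with a simple aggregation like the following hedged sketch; the minimum-tag threshold and function name are assumptions for illustration only.

```python
from collections import Counter

MIN_TAGS_FOR_TRENDS = 5  # assumed threshold for "sufficient data"

def emotional_trends(tags: list[str]) -> dict[str, float] | None:
    """Return each emotion's share of all tags for a song, as percentages,
    or None when there are not enough tags to display trends."""
    if len(tags) < MIN_TAGS_FOR_TRENDS:
        return None  # alternative flow: ask the participant to add a tag
    counts = Counter(tags)
    total = len(tags)
    return {emotion: round(100 * n / total, 1) for emotion, n in counts.items()}

tags = ["happy", "happy", "calm", "energetic", "happy", "calm"]
print(emotional_trends(tags))  # -> {'happy': 50.0, 'calm': 33.3, 'energetic': 16.7}
```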


3.2.4 Crowdsourced Data Collection Use Case

Use Case Name: Crowdsourcing Emotional Tags for Songs
Use Case Number: UC04
Authors: Emre
Actors: Participant, Admin
Overview: Allows individuals to add emotional tags to songs, creating an extensive dataset via crowdsourcing. Participants label songs according to their emotional reactions, and the system captures this information. Admins oversee data quality and ensure dataset comprehensiveness.
Related Use Cases: UC02 (Emotional Tagging of Songs), UC08 (Song and Emotion Data Analysis)

Typical Flow Description:

Precondition: The participant must be logged in to access the emotional tagging system.

  1. The participant listens to and selects a song from the available list.
  2. The system provides tagging options based on predefined emotions (e.g., "Happiness," "Sadness").
  3. The participant assigns one or more emotional tags to the selected song.
  4. The system records the emotional tags and updates the aggregated emotional dataset.
  5. Admin reviews data volume and diversity to ensure the dataset is comprehensive.

Post-condition: The emotional dataset is updated with new participant contributions, enhancing its quality and coverage.

Alternative Flow Description:

Precondition: The participant faces issues while tagging a song.

  1. The system detects incomplete or invalid tagging input.
  2. The system prompts the participant to correct or complete the tagging.
  3. The participant resubmits valid emotional tags.

Post-condition: The dataset is updated with corrected tagging input, maintaining its integrity.

Figure 4: Crowdsourcing Emotional Tags for Songs
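
The admin's review of data volume and diversity (step 5 above) might look like the following minimal sketch; the coverage thresholds are illustrative assumptions rather than project requirements.

```python
from collections import Counter

def review_coverage(tags_by_song: dict[str, list[str]],
                    min_tags: int = 10,
                    min_distinct_emotions: int = 3) -> list[str]:
    """Return the songs that still need more, or more diverse, contributions."""
    flagged = []
    for song_id, tags in tags_by_song.items():
        if len(tags) < min_tags or len(Counter(tags)) < min_distinct_emotions:
            flagged.append(song_id)
    return flagged

dataset = {
    "song-1": ["Happiness"] * 12 + ["Sadness", "Calm"],  # enough volume and variety
    "song-2": ["Sadness"] * 4,                           # too few tags, too uniform
}
print(review_coverage(dataset))  # -> ['song-2']
```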


3.2.5 Gamification for User Engagement Use Case

Use Case Name: Gamification for User Engagement
Use Case Number: UC05
Authors: Selin
Actors: Participant
Overview: Encourages user participation through gamification elements, such as points, badges, levels, challenges, and leaderboards. Participants earn rewards by completing activities and track their achievements through a leaderboard and badge system.
Related Use Cases: UC02 (Emotional Tagging of Songs), UC08 (Song and Emotion Data Analysis)

Typical Flow Description:

Precondition: The user must log in to access the gamification system.

  1. User starts an activity (e.g., tagging a song or participating in a challenge).
  2. System awards points and tracks progress.
  3. User unlocks badges or levels when milestones are achieved.
  4. System updates the leaderboard with points.
  5. User views achievements and leaderboard standings.

Post-condition: The participant’s points, badges, and leaderboard position are updated in real-time.

Alternative Flow Description:

Precondition: The participant fails to complete an activity or opts out.

  1. System does not award points.
  2. User views progress.

Post-condition: The system retains the current user progress without changes.

Figure 5: Gamification for User Engagement
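
A hedged sketch of the points, badge, and leaderboard mechanics described in this use case follows; the point values and badge thresholds are assumptions, not the project's final design.

```python
from dataclasses import dataclass, field

POINTS_PER_TAG = 10                                       # assumed reward per labeled song
BADGE_THRESHOLDS = {"Bronze": 100, "Silver": 500, "Gold": 1000}

@dataclass
class PlayerProfile:
    username: str
    points: int = 0
    badges: list[str] = field(default_factory=list)

    def award_tagging(self, n_tags: int = 1) -> None:
        """Steps 2-3: award points and unlock badges at milestones."""
        self.points += n_tags * POINTS_PER_TAG
        for badge, threshold in BADGE_THRESHOLDS.items():
            if self.points >= threshold and badge not in self.badges:
                self.badges.append(badge)

def leaderboard(players: list[PlayerProfile]) -> list[PlayerProfile]:
    """Steps 4-5: rank players by points for the leaderboard view."""
    return sorted(players, key=lambda p: p.points, reverse=True)

alice, bob = PlayerProfile("alice"), PlayerProfile("bob")
alice.award_tagging(12)   # 120 points -> Bronze badge unlocked
bob.award_tagging(3)      # 30 points, no badge yet
print([(p.username, p.points, p.badges) for p in leaderboard([alice, bob])])
```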


3.2.6 Settings and Preferences Management Use Case

Use Case Name: Settings and Preferences Management
Use Case Number: UC06
Authors: Hamza
Actors: Participant, Admin
Overview: Allows users to adjust settings such as notification preferences, theme selection, and volume controls. Adjustments are saved for future sessions.
Related Use Cases: UC01 (User Registration and Profile Management)

Typical Flow Description:

Precondition: The user must have access to the application.

  1. User accesses the Settings menu.
  2. User adjusts settings (e.g., volume control, theme, notifications).
  3. System saves settings for future sessions.

Post-condition: The user’s customized settings are stored and applied in subsequent sessions.

Alternative Flow Description:

Precondition: The user fails to save changes or exits the settings menu.

  1. System does not apply unsaved settings.
  2. User retains previous settings.

Post-condition: No changes are made to the settings, and the previous preferences remain in place.

Figure 6: Settings and Preferences Management


3.2.7 Access Privacy Settings Use Case

Use Case Name: Access Privacy Settings
Use Case Number: UC07
Authors: Tuğkan
Actors: Participant, Admin
Overview: Allows users to control privacy settings, manage data sharing preferences, and access options related to data visibility.
Related Use Cases: UC01 (User Registration and Profile Management)

Typical Flow Description:

Precondition: User must be logged into the system.

  1. User navigates to "Privacy Settings" from their profile page.
  2. System retrieves and displays current privacy settings.
  3. User chooses preferred settings (e.g., public, private, or limited access).
  4. The system updates the user's privacy settings in the database.
  5. Admins can view and adjust privacy settings for any user if necessary.

Post-condition: The user's updated privacy preferences are saved, and future interactions adhere to the new settings.

Figure 7: Access Privacy Settings


3.2.8 Song and Emotion Data Analysis Use Case

Use Case Name: Song and Emotion Data Analysis
Use Case Number: UC08
Authors: Tuğkan
Actors: Admin
Overview: Allows administrators to examine combined song and emotion information to understand user behavior and emotional patterns. Admins can create reports or visual representations for decision-making and trend assessment.
Related Use Cases: UC02 (Emotional Tagging of Songs), UC03 (View Emotional Trends)

Typical Flow Description:

Precondition: The admin must log in to access the analysis dashboard.

  1. The admin accesses the "Data Analysis Dashboard" through the admin panel.
  2. The system presents combined emotional information for various songs.
  3. The admin sorts the data based on specific criteria (e.g., date range, music genre).
  4. The system displays the insights in visual formats like charts or graphs.

Post-condition: The admin gains insights into emotional trends and user behavior for strategic decision-making.

Alternative Flow Description:

Precondition: The system encounters issues while retrieving aggregated data.

  1. Admin attempts to access the data analysis dashboard.
  2. The system notifies the admin of the issue (e.g., "Data unavailable").
  3. Admin retries or contacts support.

Post-condition: The admin resolves the issue and accesses the dashboard successfully at a later time.

Figure 8: Song and Emotion Data Analysis
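
As a rough illustration of step 3 of the typical flow (filtering by criteria such as genre and date range), the following sketch aggregates tag records with plain Python; the record shape is an illustrative assumption.

```python
from collections import Counter
from datetime import date

# Each record: (song_id, genre, emotion, date_tagged) -- assumed shape
records = [
    ("song-1", "rock", "Energetic", date(2024, 11, 2)),
    ("song-1", "rock", "Happy",     date(2024, 11, 9)),
    ("song-2", "jazz", "Calm",      date(2024, 12, 1)),
]

def analyze(records, genre=None, start=None, end=None) -> Counter:
    """Count emotions, optionally restricted to a genre and a date range."""
    selected = Counter()
    for _song, g, emotion, d in records:
        if genre and g != genre:
            continue
        if start and d < start:
            continue
        if end and d > end:
            continue
        selected[emotion] += 1
    return selected

print(analyze(records, genre="rock", start=date(2024, 11, 1)))
# -> Counter({'Energetic': 1, 'Happy': 1})
```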


3.3 Performance Requirements

3.4 Software System Attributes

3.4.1 Portability

EmoTagger is designed as a web application. The application can be accessed via standard internet browsers, making it compatible with any device that supports web browsing. No additional hardware integration is required.

3.4.2 Performance

The data layer should be enhanced for data retrieval and processing to guarantee efficient access to and storage of extensive datasets. Queries and data transactions ought to be optimized to reduce response time and lessen server load. The system must efficiently manage real-time data requests, offering users prompt feedback without significant delays. Any data-heavy tasks, like refreshing emotional response records, ought to be structured to finish in just a few seconds to ensure a seamless user experience.

3.4.3 Usability

The data layer must facilitate organized and transparent data access, allowing smooth communication between the user interface and the backend. Error messages regarding data ought to be informative, offering assistance on how to fix problems connected to data retrieval or input mistakes. The system must enable straightforward integration of data from outside sources if needed for future improvements.

3.4.4 Adaptability

The data layer should be adaptable to potential changes in data structure, such as adding new data fields or modifying data relationships, without requiring significant code changes. The system should be flexible to accommodate future scalability and evolving data requirements as user engagement grows.

3.4.5 Scalability

The data layer must facilitate horizontal scaling to handle rising data volume and simultaneous users without a decline in performance. The database must be optimized to manage large-scale read and write activities, guaranteeing stable performance as both the user base and dataset grow.

3.5 Safety Requirement

EmoTagger needs to adhere to data security best practices to safeguard user data and emotional reactions. This involves deploying encryption for both stored and transmitted data, along with role-based access management to block unauthorized access to information. Backup and recovery systems must be implemented to safeguard against data loss or damage.
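
One way the role-based access management mentioned above could look in practice is sketched below; the role names, permissions, and decorator are illustrative assumptions, not the actual implementation.

```python
from functools import wraps

ROLE_PERMISSIONS = {                      # assumed role model
    "participant": {"tag_song", "view_trends"},
    "admin": {"tag_song", "view_trends", "view_analytics", "manage_users"},
}

def requires(permission: str):
    """Decorator enforcing role-based access on backend operations."""
    def decorator(func):
        @wraps(func)
        def wrapper(user_role: str, *args, **kwargs):
            if permission not in ROLE_PERMISSIONS.get(user_role, set()):
                raise PermissionError(f"{user_role!r} may not {permission}")
            return func(user_role, *args, **kwargs)
        return wrapper
    return decorator

@requires("view_analytics")
def open_analytics_dashboard(user_role: str) -> str:
    return "analytics dashboard"

print(open_analytics_dashboard("admin"))   # allowed
# open_analytics_dashboard("participant")  # would raise PermissionError
```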

1. Software Design Description

1.1 INTRODUCTION

1.1.1 Purpose

This document aims to delineate the plan for creating a web application that allows users to tag music files according to their emotional effect. This initiative utilizes crowdsourcing to collect a wide variety of interpretations related to the emotional aspects of music, thereby producing a thorough dataset that illustrates how various listeners emotionally react to identical music pieces. We aim to create a dataset containing over 100 labeled music files that reflect a broad range of emotional responses from users.

To sustain elevated user engagement, the app will integrate gamification features like points, badges, and tiers. These attributes seek to enhance the labeling experience, making it more enjoyable and fulfilling, thus promoting active involvement. Users can monitor their progress, participate in challenges, and see their standings on leaderboards, adding a competitive and enjoyable element to the experience. This strategy aims to promote ongoing involvement, aiding the project's objective of collecting a comprehensive and precise array of emotion-driven music labels.

1.1.2 Scope of the Project

The main feature of this app is to offer a space for users to listen to selected music tracks and convey their emotional reactions by choosing from various predefined emotion categories. This procedure seeks to gather a variety of emotional responses to build a thorough dataset, valuable for analyzing how various music genres impact listeners' emotions. The platform attracts a diverse audience, including casual listeners wanting self-reflection, music lovers, researchers, and individuals curious about the impact of music on emotional health.

The application offers a structured, user-focused method for recording emotional data by enabling users to choose from a range of emotion tags like "Happy," "Sad," "Energetic," or "Relaxed." This dataset acts as the basis for subsequent analysis and inquiries in areas such as music therapy, psychology, and data science, providing understanding of trends like typical emotional reactions to specific genres, tempos, or lyrical motifs. It is intended to facilitate a thorough investigation of how demographics and individual backgrounds might influence emotional reactions to different aspects of music.

1.2 Key features of the app include:

Emotion Labeling and Survey Feedback Collection: Users listen to every music track and select an emotion that corresponds with their feelings. Moreover, they have the opportunity to offer more specific feedback, explaining how the song affected them personally. This information reflects not just instant emotional responses but also detailed, subtle reactions that enhance the comprehension of music's emotional influence.

Gamification Elements: By incorporating points, badges, tiers, and leaderboards, the application promotes continuous involvement, enhancing the labeling experience to be more captivating and fulfilling. These factors contribute to creating a competitive and enjoyable atmosphere, encouraging users to monitor their progress, engage in challenges, and check their rankings on leaderboards. Gamification enhances engagement and also helps in gathering a bigger, more precise dataset.

Interactive User Profiles and Emotion Tagging: Each user’s emotional responses and survey results are saved in a personalized profile, which includes interactive elements like mood charts, timelines, and visual graphs. This profile allows users to reflect on their emotional journey through music over time, identifying trends, such as times when they feel more relaxed or energized, or observing preferences for certain genres. Users can filter their emotional data by genre, song, or artist, helping them gain insight into how music impacts their mood.

Data Analytics and Reporting for Researchers and Administrators: Administrators and scholars can utilize the gathered data to conduct real-time emotional trend assessments. A thorough analytics dashboard allows for in-depth reporting, offering options to analyze data segmented by demographics, music genre, emotion tags, and additional criteria. Sophisticated instruments such as heat maps and emotion-driven charts offer a visual depiction of emotional trends, aiding in recognizing patterns and improving music choices or emotional labels. The insights gained from this analysis may be useful in music therapy, entertainment, advertising, and other areas that are focused on the psychological effects of music.

Personalized Music Recommendation Engine: By analyzing users' emotional reactions, the application will provide personalized music suggestions. For example, when a user often labels songs as "Energetic" or "Happy," the system will suggest additional lively tracks that match these feelings. On the other hand, if the user picks "Calm" or "Reflective," they will get recommendations for more soothing or introspective music. The recommendation system evolves gradually, becoming more tailored as it gathers continuous feedback from users.
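
A minimal, hedged sketch of such a recommendation step follows, assuming a simple overlap score between the emotions a user tags most often and each song's aggregated emotional profile; this is an illustration, not the actual engine.

```python
from collections import Counter

def recommend(user_history: list[str],
              song_profiles: dict[str, Counter],
              top_n: int = 3) -> list[str]:
    """Rank songs by how well their emotional profile matches the emotions
    the user has tagged most often."""
    preferences = Counter(user_history)
    def score(song_id: str) -> int:
        profile = song_profiles[song_id]
        return sum(preferences[e] * profile[e] for e in preferences)
    return sorted(song_profiles, key=score, reverse=True)[:top_n]

profiles = {
    "upbeat-track": Counter({"Energetic": 8, "Happy": 5}),
    "calm-track":   Counter({"Calm": 9, "Reflective": 4}),
    "mixed-track":  Counter({"Happy": 3, "Calm": 3}),
}
print(recommend(["Energetic", "Happy", "Happy"], profiles))
# -> ['upbeat-track', 'mixed-track', 'calm-track']
```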

Community and Social Features: To promote community spirit, the app offers social elements allowing users to connect with friends, express emotional reactions, compare profiles, and participate in discussions about the impact of music on emotions. Elements such as joint playlists or cooperative surveys, where users develop and distribute music collections centered on particular feelings (e.g., "Chill Vibes" or "Upbeat Rhythms"), introduce an engaging and communal dimension to the platform, boosting user involvement.

Privacy and Data Security: Due to the sensitivity of emotional and personal information, strong security protocols are established. The application will adhere to data privacy laws (e.g., GDPR), guaranteeing responsible data management practices. Users can manage their data and have the option to delete their accounts if they wish.

This document offers crucial directions for developers, project managers, and stakeholders to guarantee understanding of the project's goals, functionalities, and aims. It coordinates every stage of development to address user requirements and provide a captivating, user-focused experience. In the end, this project seeks to collect significant emotional data and improve users’ comprehension of the impact music has on their feelings.

1.3 Glossary

| Term | Definition |
| --- | --- |
| AI (Artificial Intelligence) | The simulation of human intelligence by machines, often used in data analysis, pattern recognition, and emotion prediction. AI enables EmoTagger to process and analyze emotional data to improve recommendations. |
| API (Application Programming Interface) | A set of protocols and tools allowing different software applications to communicate and share data, enabling EmoTagger to integrate with third-party services like music platforms. |
| Annotation | The process of labeling or tagging data (in this case, music) with relevant information, such as emotions, to make it useful for machine learning and analytics. |
| Backend | The server-side part of EmoTagger, where data processing, storage, and retrieval are managed. This supports functionalities such as data storage, emotion analysis, and user management. |
| Crowdsourcing | The process of gathering input from a large group of people online, used in EmoTagger to collect emotional labels from a diverse user base, enhancing the emotional dataset and improving machine learning models through varied perspectives. |
| Community Engagement | A feature allowing users to interact, share, and discuss their emotional responses to music within EmoTagger, creating a social and interactive experience. |
| Classification | In the context of EmoTagger, the grouping of music based on emotions, moods, or other factors for easier recommendation and analysis. |
| Data Annotation | Specifically for audio and music, this involves categorizing and tagging audio files based on predefined emotional criteria, which becomes valuable training material for AI and ML applications. |
| Data Privacy | Measures ensuring that user information and emotional responses are protected from unauthorized access, adhering to legal and ethical standards. |
| Data Repository | A centralized storage within EmoTagger that contains emotional responses and related data for analysis and visualization. |
| DBMS (Database Management System) | Software used to create, manage, and interact with databases, where EmoTagger’s user and survey data is stored and managed. |
| Emotion Annotation | Tagging songs with specific emotions, such as happiness or sadness, based on user feedback. This helps in understanding and analyzing the emotional effects of music. |
| Emotion Spectrum | A range of emotions (e.g., joy, calmness, nostalgia) that users can select to describe their reactions to a song, allowing EmoTagger to capture complex emotional responses. |
| Frontend | The client-side or user-facing part of EmoTagger, where users interact with the application through interfaces such as forms, buttons, and visualizations. |
| FTP (File Transfer Protocol) | A network protocol used to transfer files between computers, which may be used in managing EmoTagger’s server files. |
| Gamification | Using game-like elements (e.g., points, badges, challenges) in EmoTagger to increase user engagement, motivating users to actively participate in the survey and data labeling. |
| Interface | The visual and interactive components of EmoTagger (e.g., buttons, menus, survey forms) that allow users to interact with the application. |
| JSON (JavaScript Object Notation) | A lightweight data format used for exchanging structured data between EmoTagger’s client and server, making it easier to send user feedback and emotional data. |
| Leaderboard | A ranking system within EmoTagger that displays top contributors, encouraging users to engage more actively by ranking based on contributions like surveys and annotations. |
| Machine Learning (ML) | A subset of AI that enables EmoTagger to learn from data and predict trends in emotional responses, supporting emotion analysis and improving recommendation systems. |
| Metadata | Data about data. For EmoTagger, metadata may include song details like title, artist, and genre, which help in organizing and analyzing user responses. |
| Primary Emotions | Basic emotions such as happiness, sadness, anger, or fear, which are universally recognized and easy to identify. |
| Profile | A personal area where users can oversee their EmoTagger account, review previous activity, and monitor engagement statistics. |
| Real-Time Data Analysis | EmoTagger's capability to evaluate and process data instantly upon entry, enabling users to observe immediate responses or trends. |
| Registered Users | Those who establish an account in EmoTagger, allowing them to engage in surveys, earn rewards, and utilize special community features. |
| Secondary Emotions | More intricate feelings, like nostalgia or serenity, offering a detailed perspective on user reactions that goes beyond primary emotions. |
| Survey | A series of questions given to users to obtain their feedback on emotional reactions to particular songs, establishing a basis for emotion annotations. |
| Tagging | The act of assigning emotional labels to music, enabling EmoTagger to sort songs according to emotional reactions for evaluation and suggestions. |
| Trend Graphs | Visual depictions of patterns in EmoTagger that illustrate the evolution of users' emotional reactions to songs, providing insights into prevalent emotional trends. |
| UI (User Interface) | The visual elements of EmoTagger, including buttons, forms, and graphics, that users engage with. |
| User Experience (UX) | The complete impression and contentment a user feels when engaging with EmoTagger, influenced by design, functionality, and system efficiency. |
| Web Application | The browser-accessible version of EmoTagger, created to provide desktop users with functionalities akin to those of the mobile application. |
| XML (Extensible Markup Language) | A markup language utilized for organizing data, frequently for data exchange between EmoTagger and various other applications. |


1.4 Overview of the Document

The EmoTagger project is an innovative and comprehensive music survey application designed to deeply explore and understand the emotional connections users have with music. By allowing users to listen to a variety of songs and tag their emotional responses, the app builds a rich, extensive, and diverse dataset of these reactions. This valuable dataset serves multiple purposes, both personal and academic, enabling detailed study and analysis of how different types of music impact individuals emotionally. The project leverages crowdsourcing techniques to gather a wide array of emotional interpretations from a diverse and varied group of users, thereby enhancing the dataset's accuracy, cultural depth, and richness.

Gamification elements are thoughtfully integrated into the platform to keep users engaged, motivated, and encouraged to contribute regularly, ensuring that the dataset remains robust, comprehensive, and high-quality. The resulting extensive dataset supports significant advancements in AI-driven emotional music analysis, which can be applied in various areas such as personalized music recommendation systems, therapeutic music interventions, and cultural studies. By offering users a platform to reflect on, share, and analyze their emotional responses to music, EmoTagger bridges the gap between music and emotions, providing a rich, engaging, and immersive experience for anyone interested in exploring this dynamic and multifaceted relationship. The app not only facilitates personal exploration and discovery but also contributes to academic research and understanding, making it a valuable tool for both individual users and researchers alike.


2. ARCHITECTURE DESIGN

2.1 General Design Approach of the Project

The project aims to develop a user experience that is both intuitive and emotionally captivating. A user-focused design guarantees that the platform is easy to use, accessible, and customized to meet user requirements. Featuring a simple, minimalist design, users can effortlessly explore functionalities like search, filtering, and emotional tagging, creating interactions that are both smooth and significant.

The recommendation system utilizes crowdsourced emotional labels to provide tailored song recommendations. The backend structure effectively manages a database of over 100 songs and can expand to support future development. Responsive design guarantees the best usability on all devices, whereas real-time functionalities such as instant playback and tagging boost user engagement.

Data privacy and security are emphasized, protecting user information and building trust. Aesthetically pleasing features enhance the emotional core of the project, improving the overall user experience. Ongoing testing and feedback support continuous enhancement and feature optimization.

Analytics direct enhancement initiatives, boosting suggestions and user contentment. The modular structure facilitates the seamless integration of future upgrades, guaranteeing flexibility. Cooperation among all parties fosters a unified and creative method, providing a platform that strongly links users with music.


2.2 Architecture Design of the Webpage

The design of the webpage’s structure aims to offer an effective and smooth user experience while ensuring scalability and dependability. The system is structured into three main layers: the frontend, backend, and database.

  • Frontend Layer: Employs contemporary web technologies to create a responsive and attractive interface, guaranteeing it functions seamlessly on different devices, such as desktops, tablets, and smartphones. This layer enables user activities like searching, filtering, and tagging, providing an interactive and user-friendly experience.

  • Backend Layer: Serves as the system's foundation, processing user requests, implementing business logic, and overseeing secure connections with the database. Engineered for scalability, the backend effectively manages heavy traffic and substantial data loads, guaranteeing reliable performance. It also includes the recommendation engine that evaluates emotional tags to provide customized content to users.

  • Database Layer: Handles the storage and organization of crucial data, including song metadata, user profiles, and emotional tags. The design focuses on quick and dependable data access to facilitate real-time operations, and robust measures such as encryption and restricted access protocols safeguard user information.

APIs link these three layers, enabling seamless interaction and promoting modularity and flexibility. This design facilitates future feature upgrades and integrations, allowing the platform to expand while maintaining reliability and performance; a minimal sketch of the layering follows.
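
The sketch below illustrates, under stated assumptions, how a single tagging request could travel through the three layers. Flask and SQLite stand in for whichever frameworks the team actually uses, and the route, table name, and label set are hypothetical.

```python
# Hypothetical sketch of the frontend -> backend -> database layering.
# Flask and sqlite3 stand in for the real stack; names are illustrative.
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "emotagger.db"  # assumed database file with an emotion_tags table

def save_tag(song_id: int, user_id: int, emotion: str) -> None:
    """Database layer: persist one emotion tag."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "INSERT INTO emotion_tags (song_id, user_id, emotion) VALUES (?, ?, ?)",
            (song_id, user_id, emotion),
        )

@app.post("/api/songs/<int:song_id>/tags")
def tag_song(song_id: int):
    """Backend layer: validate the frontend's request, then hand it to the database layer."""
    payload = request.get_json(force=True)
    emotion = payload.get("emotion", "")
    if emotion not in {"Joyful", "Melancholic", "Vibrant"}:  # assumed label set
        return jsonify({"error": "unknown emotion label"}), 400
    save_tag(song_id, int(payload.get("user_id", 0)), emotion)
    return jsonify({"status": "ok"}), 201
```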


3. SYSTEM DESIGN

3.1 UML Diagram

UML Diagram

  • Figure 9: Class Diagram

3.1.1 User Interface

HomePage of User Interface
Figure 10: HomePage of User Interface

Login Page of User Interface
Figure 11: Login Page of User Interface

Register Page of User Interface
Figure 12: Register Page of User Interface

Home Page 2 of User Interface
Figure 13: Home Page 2 of User Interface

Profile Page of User Interface
Figure 14: Profile Page of User Interface

Settings Page of User Interface
Figure 15: Settings Page of User Interface

Listening Page of User Interface
Figure 16: Listening Page of User Interface

Community Page of User Interface
Figure 17: Community Page of User Interface

3.1.2 Admin Interface

Admin Page Interface
Figure 18: Admin Page Interface

3.2 Decomposition Description

3.2.1 User Registration and Profile Management

User Registration

  • Figure 19: Activity Diagram for User Registration and Profile

Registration Sequence

  • Figure 20: Sequence Diagram for User Registration and Profile

3.2.2 Emotional Tagging of Songs

Emotional Tagging

  • Figure 21: Activity Diagram for Emotional Tagging of Songs

Tagging Sequence

  • Figure 22: Sequence Diagram for Emotional Tagging of Songs

3.2.3 View Emotional Trends

Emotional Trends

  • Figure 23: Activity Diagram for View Emotional Trends

Trends Sequence

  • Figure 24: Sequence Diagram for View Emotional Trends

3.2.4 Crowdsourced Data Collection

Crowdsourced Data

  • Figure 25: Activity Diagram for Crowdsourced Data Collection

Data Collection Sequence

  • Figure 26: Sequence Diagram for Crowdsourced Data Collection

3.2.5 Gamification for User Engagement

Gamification

  • Figure 27: Activity Diagram for Gamification for User Engagement

Engagement Sequence

  • Figure 28: Sequence Diagram for Gamification for User Engagement

3.2.6 Settings and Preferences Management

Options Menu Settings Preferences

  • Figure 29: Activity Diagram for Options Menu

  • Figure 30: Sequence Diagram for Option Menu

3.2.7 Access Privacy Settings

Privacy Settings

  • Figure 31: Activity Diagram for Access Privacy Settings

Privacy Sequence

  • Figure 32: Sequence Diagram for Access Privacy Settings

3.2.8 Song and Emotion Data Analysis

Song and Emotion

  • Figure 33: Activity Diagram for Song and Emotion Data Analysis

Data Analysis Sequence

  • Figure 34: Sequence Diagram for Song and Emotion Data Analysis

3.3 Performance Requirements

The project must be optimized for a substantial music database and significant traffic levels. The site must provide a quick, smooth user experience, particularly while users search for songs, apply filters, and interact with the system. With over 100 songs and potentially millions of user interactions, database queries and user actions must be handled with minimal delay. Moreover, because users categorize songs according to the feelings they experience while listening, the tagging system must operate efficiently and accurately.


3.3.1 Database and Server Performance

The site’s database needs to enable quick data retrieval during every user interaction, which requires a scalable, high-performance database framework. Data queries, particularly for song searches, filtering, and tagging, should return results quickly, and the song recommendation engine and user interaction history must be handled with minimal latency. The server infrastructure needs to scale to handle growing user traffic and data volumes. Load balancing and caching strategies must be applied to distribute traffic evenly among servers, guaranteeing optimal performance even at peak times.
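
As one hedged illustration of the caching strategy described above, the sketch below memoizes frequent song-search queries with a short time-to-live, so repeated searches skip the database. The `_search_songs_db` function, cache size, and 60-second window are assumptions for illustration only.

```python
# Sketch: a short-lived cache in front of the song-search query (names are hypothetical).
import time
from functools import lru_cache

CACHE_TTL_SECONDS = 60  # assumed freshness window for cached search results

def _search_songs_db(query: str) -> list[str]:
    """Placeholder for the real database search (the slow path in production)."""
    catalog = ["joyful morning", "midnight melancholy", "vibrant city lights"]
    return [title for title in catalog if query in title]

@lru_cache(maxsize=1024)
def _cached_search(query: str, time_bucket: int) -> tuple[str, ...]:
    # The time_bucket argument changes every CACHE_TTL_SECONDS, so stale
    # entries fall out of use without an explicit invalidation step.
    return tuple(_search_songs_db(query))

def search_songs(query: str) -> list[str]:
    bucket = int(time.time() // CACHE_TTL_SECONDS)
    return list(_cached_search(query.strip().lower(), bucket))

print(search_songs("Vibrant"))  # a second identical call within 60 s hits the cache
```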


3.3.2 AI and Data Processing

The algorithms powered by AI that assess the emotional tone of music need to be both precise and quick. The AI system needs to be trained to swiftly analyze large volumes of data and generate emotion-driven predictions instantly. This procedure needs to be refined to prevent any delays, as such issues could impact the user experience negatively. The AI must have a strong capacity to associate songs with particular emotional tags, allowing users to accurately label songs while also enhancing the system’s suggestions based on these labels.
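
Purely as an illustration of how emotion-driven predictions could be generated quickly from crowdsourced labels, the sketch below trains a lightweight scikit-learn classifier on hypothetical audio features (tempo, energy, valence). The feature set, sample data, and choice of library are assumptions, not the project's actual AI pipeline.

```python
# Sketch: training a lightweight emotion classifier on hypothetical audio
# features paired with crowdsourced labels. Feature names and data are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Columns: tempo (BPM), energy (0-1), valence (0-1) -- assumed features.
X = np.array([
    [128, 0.82, 0.90],
    [ 72, 0.30, 0.15],
    [140, 0.95, 0.70],
    [ 65, 0.25, 0.20],
    [118, 0.70, 0.85],
])
y = ["Joyful", "Melancholic", "Vibrant", "Melancholic", "Joyful"]

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, y)

# Predict an emotion tag for an unseen song (illustrative feature values).
print(model.predict([[100, 0.6, 0.5]]))
```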


3.3.3 User Experience and Response Times

Users anticipate quick and precise outcomes when looking for songs or exploring emotional tags. Efficient search algorithms, filtering methods, and recommendation systems are crucial for delivering fast responses, preventing user frustration while navigating the site. User interactions must receive prompt feedback, and the whole platform should be tailored for excellent responsiveness on all devices, including desktops and mobile platforms. The database queries need to be optimized to guarantee fast loading times and reduced latency, which will improve the overall user experience. Enhanced query execution and indexing will boost data retrieval rates, enabling users to quickly access their preferred songs or emotional tags. By emphasizing performance tuning and front-end optimization, the platform is capable of providing a quick, smooth experience, even under high concurrent traffic. [7]
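
To make the indexing point concrete, the sketch below uses Python's built-in sqlite3 module with an assumed schema (`songs` and `emotion_tags` tables) to add indexes on the columns most often hit by search and tag filters, then checks the query plan. The real schema and database engine may differ.

```python
# Sketch: adding indexes for common search/filter queries (schema is assumed).
import sqlite3

with sqlite3.connect("emotagger.db") as conn:
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS songs (
            id INTEGER PRIMARY KEY,
            title TEXT NOT NULL,
            genre TEXT
        );
        CREATE TABLE IF NOT EXISTS emotion_tags (
            song_id INTEGER REFERENCES songs(id),
            user_id INTEGER,
            emotion TEXT
        );
        -- Indexes that speed up title search and per-emotion filtering.
        CREATE INDEX IF NOT EXISTS idx_songs_title ON songs(title);
        CREATE INDEX IF NOT EXISTS idx_tags_emotion ON emotion_tags(emotion, song_id);
    """)
    # EXPLAIN QUERY PLAN shows whether the emotion index is actually used.
    plan = conn.execute(
        "EXPLAIN QUERY PLAN SELECT song_id FROM emotion_tags WHERE emotion = ?",
        ("Joyful",),
    ).fetchall()
    print(plan)
```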


3.3.4 Security and Data Protection

User information, such as mood labels and music tastes, must be managed safely and according to standard industry privacy protocols. The system needs to guarantee that data is anonymized and safeguarded against unauthorized access, especially when handling sensitive user details. Measures for security need to be implemented to protect user data and guarantee that only authorized systems can access or alter it. The platform should likewise comply with user privacy policies and regulatory standards to guarantee secure and ethical management of data. The system can protect against possible data breaches and maintain user trust by using encryption, multi-factor authentication, and secure access procedures. Furthermore, adherence to global privacy laws like GDPR will be crucial for safeguarding user rights and guaranteeing the lawful management of personal information. [6]
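
As one possible illustration of the measures above, the sketch below pseudonymizes user identifiers with a keyed hash before analytics export and stores passwords only as salted PBKDF2 hashes. Key handling, algorithm choices, and function names are assumptions rather than the system's actual security implementation.

```python
# Sketch: pseudonymizing user IDs and hashing passwords (illustrative only).
import hashlib
import hmac
import os

# In production this key would come from a secrets manager, not source code.
ANONYMIZATION_KEY = os.environ.get("ANON_KEY", "dev-only-key").encode()

def pseudonymize_user_id(user_id: str) -> str:
    """Replace a raw user ID with a keyed hash before analytics export."""
    return hmac.new(ANONYMIZATION_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store only a salted PBKDF2 hash of the password, never the plaintext."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)
```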


3.4 Software System Attributes

3.4.1 Portability

The platform will be a website and should work with all primary operating systems (Windows, macOS, Linux) as well as widely used web browsers such as Google Chrome, Mozilla Firefox, Safari, and Microsoft Edge. The site will be created utilizing responsive web design techniques, enabling it to automatically modify its layout and interface according to the device’s screen dimensions. This method guarantees an ideal experience on desktop computers, tablets, and smartphones.

Employing web technologies such as HTML5, CSS3, and JavaScript ensures compatibility across platforms and facilitates dynamic, engaging content. To accommodate older browsers, progressive enhancement will be utilized, guaranteeing that essential features stay functional even with restricted browser capabilities. Moreover, the backend will be cloud-based, employing services such as AWS or Google Cloud to guarantee scalability and quick access around the globe. Content Delivery Networks will additionally be used to boost loading speeds, enhancing the user experience regardless of location. In short, by prioritizing responsive design and cross-platform compatibility, the website will provide a smooth experience across different devices and browsers.


3.4.2 Performance

The website must efficiently handle large data volumes and user interactions, ensuring a smooth experience even during peak usage. As the catalog grows toward 10,000+ songs and millions of user interactions, the platform must process queries and display results in real time with minimal delay, so that users can search, filter, and label tracks quickly.

The backend must scale automatically as traffic increases, utilizing cloud services to ensure high availability. Efficient data retrieval will be achieved through optimized database queries and caching, reducing response times, while redundant systems and load balancing maintain stability under heavy traffic.

AI features such as mood-based recommendations must operate without noticeable delay, with data analysis completed quickly enough to deliver real-time suggestions. The system must remain responsive as data volume grows, ensuring users always have a fast experience.

By optimizing server-side performance, data storage, and AI processing, the website will remain responsive and reliable, offering a consistent user experience.


3.4.3 Usability

The website should be easy to use, enabling users to navigate smoothly and reach all functions with little effort. The interface ought to be straightforward and user-friendly, facilitating effortless song searches, filter application, and mood tag selection without any confusion. Important features must be apparent and readily accessible, with straightforward, actionable buttons directing users through every stage.

Search and filtering features must be fast and responsive, allowing users to locate songs according to their preferences without any lag. Tagging based on mood needs to be easy, allowing users to assign labels to songs that reflect the feelings they have while listening. Immediate feedback must be given, allowing users to view their choices and actions right away.

The website needs to be completely responsive, adapting effortlessly to various screen sizes, providing a seamless experience on both desktop and mobile browsers. The layout must adjust automatically, ensuring the interface remains tidy and user-friendly on different devices.

Error handling is crucial for user experience: if users make a mistake, such as selecting an incorrect mood tag, the system must offer clear guidance for correcting it. Moreover, the platform should provide easy-to-understand tips or guidelines, particularly for newcomers, to enhance the overall experience.

The goal is to develop a website that appears natural and pleasurable, enabling users to effortlessly engage with the platform to discover and connect with the content they seek.


3.4.4 Adaptability

The website needs to be adaptable to various user needs, preferences, and devices, ensuring a seamless experience across different environments. Since user behavior and device usage can vary, the platform should automatically adjust its interface to provide an optimal experience. It should accommodate diverse screen sizes, from desktop monitors to smaller mobile screens, ensuring consistent functionality without compromising design or performance.

As the website grows and evolves, it should integrate new features and adapt to changes in technology. The system should be designed to incorporate future updates, keeping the platform current with trends in music discovery, user experience, and AI advancements. Regular updates should enhance the user experience by offering new functionalities and improved performance without causing disruptions.


3.4.5 Scalability

Scalability is crucial for accommodating a growing user base and expanding data. As the number of users, songs, and interactions increases, the platform must sustain performance levels. The infrastructure should handle increased traffic, data, and queries, with resources allocated dynamically based on demand to ensure stability during peak times. Cloud-based solutions enable this by expanding capacity during high-traffic periods and reducing it during low periods, balancing cost and performance.

Data storage systems must handle growth efficiently, retrieving information promptly and minimizing delays. Caching can accelerate frequent queries and prevent slowdowns, while the backend and AI systems must process larger datasets and still provide recommendations swiftly. The platform should be capable of scaling to accommodate increased data as it evolves.

In essence, scalability ensures that the website can grow, meet increasing demands, and maintain a consistent user experience as it expands.


4. Test Plan and Results

PROJECT TEST PLAN

TABLE OF CONTENTS

  1. Introduction
  2. Features to be Tested
  3. Features Not to be Tested
  4. Item Pass/Fail Criteria
  5. References
  6. Test Design Specifications
  7. Detailed Test Cases

4.1. INTRODUCTION

4.1.1 Version Control

| Version | Description of Changes | Date |
|---------|------------------------|------------|
| 1.0 | First Version | March 2025 |

4.1.2 Overview

This document defines the test plan for the EmoTagger web application. The project is a crowdsourced emotion tagging platform that allows users to listen to music and tag the emotions they experience. It includes a gamified interface, community engagement features, and an analytics dashboard for data tracking.

4.1.3 Scope

This test plan includes the testing scope, test strategy, design specifications, pass/fail criteria, detailed test cases, and schedule for GUI, admin panel, user interaction forms, and feedback system.

4.1.4 Terminology

| Acronym | Definition |
|---------|----------------------------|
| GUI | Graphical User Interface |
| ADM | Admin Dashboard |
| FORM | Form Validation |

4.2. FEATURES TO BE TESTED

  • Graphical User Interface (GUI)
  • Emotion Annotation Module
  • Gamification System
  • Admin Dashboard
  • Form Validation
  • Music Player Controls
  • Navigation Bar
  • Point-Based Leaderboard
  • Admin Panel Navigation
  • CSV export functionality

4.3. FEATURES NOT TO BE TESTED

  • External API for music recommendation (handled by third-party)
  • Payment or donation-related systems
  • Badge system
  • Multi-Language Support

4.4. ITEM PASS/FAIL CRITERIA

A test case is considered passed if the actual output matches the expected output under the given test conditions. It is failed if any mismatch, error, or unexpected behavior is observed.

4.4.1 Exit Criteria

  • 100% of the test cases are executed
  • 95% of the test cases passed
  • All high and medium priority test cases passed

4.5. REFERENCES

  • EmoTagger_SRS_V1.0 – February 2025
  • EmoTagger_SDD_V1.0 – February 2025

4.6. TEST DESIGN SPECIFICATIONS

4.6.1 Gamification System

4.6.1.1 Subfeatures

  • Earn Points for Tagging (GAM.POINTS)
  • Display Leaderboard (GAM.LBD)

4.6.1.2 Test Cases

| TC ID | Requirement | Priority | Scenario Description |
|-------|-------------|----------|----------------------|
| GAM.POINTS.01 | Tag Counting | High | User tags 1 song, verify point increment |
| GAM.LBD.01 | Leaderboard Display | Medium | Leaderboard shows user’s rank after tagging |
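
Purely as an illustration of how GAM.POINTS.01 might be automated, the sketch below exercises a hypothetical `award_points_for_tag` service with pytest; the scoring rule and API are assumptions, since the table above specifies the scenario only at a high level.

```python
# Illustrative automation of GAM.POINTS.01 (hypothetical service API, pytest style).
POINTS_PER_TAG = 10  # assumed scoring rule

class UserProfile:
    def __init__(self) -> None:
        self.points = 0

def award_points_for_tag(profile: UserProfile) -> None:
    """Stand-in for the real gamification service that credits a tagging action."""
    profile.points += POINTS_PER_TAG

def test_points_increment_after_single_tag():
    profile = UserProfile()
    award_points_for_tag(profile)          # user tags exactly one song
    assert profile.points == POINTS_PER_TAG
```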

4.6.2 Form Validation

4.6.2.1 Subfeatures

  • Registration Form (FORM.REG)
  • Login Form (FORM.LOGIN)
  • Empty and Invalid Fields (FORM.INVALID)

4.6.2.2 Test Cases

| TC ID | Requirement | Priority | Scenario Description |
|-------|-------------|----------|----------------------|
| FORM.REG.01 | Required Fields | High | Submit form with empty fields, errors displayed |
| FORM.REG.02 | Email Format | Medium | Enter invalid email, system rejects input |
| FORM.LOGIN.01 | Wrong Credentials | High | Login with wrong password, error shown |
| FORM.INVALID.01 | Prevent Double Submit | Medium | Double-click submit sends only one request |

4.6.3 Graphical User Interface (GUI)

4.6.3.1 Subfeatures

  • Start Tagging (GUI.STRT_BTN)
  • Submit Emotion (GUI.SUB_BTN)
  • View Profile (GUI.PROF_BTN)
  • Music Player Controls (GUI.MUSIC_BTN)
  • Navigation Bar (GUI.NAV)

4.6.3.2 Test Cases

| TC ID | Requirement | Priority | Scenario Description | Status |
|-------|-------------|----------|----------------------|--------|
| GUI.STRT_BTN.01 | Start Tagging Page | High | Open homepage, click "Start Tagging" button | Pass |
| GUI.SUB_BTN.01 | Submit Emotion Tag | High | Select emotion and click submit while listening | Pass |
| GUI.MUSIC_BTN.01 | Music Playback | High | Play, pause, skip, and return music controls | Pass |
| GUI.NAV.01 | Navigation Function | High | Click each nav bar item and verify redirection | Pass |

4.6.4 Admin Panel

4.6.4.1 Subfeatures

  • User Management (ADM.USR)
  • View Statistics (ADM.STATS)
  • Review Tag Logs (ADM.REV_TAGS)

4.6.4.2 Test Cases

| TC ID | Requirement | Priority | Scenario Description | Status |
|-------|-------------|----------|----------------------|--------|
| ADM.USR.01 | Deactivate Account | High | Admin disables user and test login fails | Pass |
| ADM.STATS.01 | View Tagging Stats | High | Access analytics and confirm data chart | Pass |
| ADM.REV_TAGS.01 | Review Tag Logs | Medium | Admin filters and views historical tags | Pass |

4.6.5 General Features

4.6.5.1 Subfeatures

  • Point System (GEN.POINTS.01)
  • Leaderboard Display (GEN.LEADERBOARD.01)

4.6.5.2 Test Cases

| TC ID | Requirement | Priority | Scenario Description | Status |
|-------|-------------|----------|----------------------|--------|
| GEN.POINTS.01 | Tagging, Point Rule | High | Tag a song and verify points increase | Pass |
| GEN.LEADERBOARD.01 | Leaderboard View | Medium | Check if users are ranked based on points | Pass |

4.7. DETAILED TEST CASES

4.7.1 ADM.USR.01

| Field | Description |
|-------|-------------|
| Test ID | ADM.USR.01 |
| Purpose | Validate that the admin can deactivate a user account |
| Requirements | Admin User Management |
| Priority | High |
| Estimated Time | 3 minutes |
| Setup | Admin is logged in, test user is active |
| Procedure | Navigate to User Management → Select user → Click 'Deactivate' → Try login |
| Expected Result | Login fails, user is marked as inactive |
| Cleanup | Reactivate user if needed |

4.7.2 GUI.MUSIC_BTN.01

| Field | Description |
|-------|-------------|
| Test ID | GUI.MUSIC_BTN.01 |
| Purpose | Verify music player buttons work correctly |
| Requirements | Music Playback |
| Priority | High |
| Estimated Time | 3 minutes |
| Procedure | Play → Pause → Next → Previous |
| Expected Result | Music controls respond accordingly |
| Cleanup | None |

4.7.3 GAM.LBD.01

| Field | Description |
|-------|-------------|
| Test ID | GAM.LBD.01 |
| Purpose | Ensure leaderboard updates based on user activity |
| Requirements | Gamification - Leaderboard |
| Priority | Medium |
| Estimated Time | 3 minutes |
| Procedure | User A tags more than User B → View leaderboard |
| Expected Result | User A is ranked higher than User B |
| Cleanup | None |

4.7.4 GUI.STRT_BTN.01

| Field | Description |
|-------|-------------|
| Test ID | GUI.STRT_BTN.01 |
| Purpose | Verify that the tagging interface appears when 'Start Tagging' is clicked |
| Priority | High |
| Estimated Time | 2 minutes |
| Setup | User is logged in and on the homepage |
| Procedure | Click 'Start Tagging' → Verify tagging panel appears |
| Expected Result | Emotion tagging panel appears successfully |
| Cleanup | Return to homepage |

4.7.5 FORM.REG.01

| Field | Description |
|-------|-------------|
| Test ID | FORM.REG.01 |
| Purpose | Validate that empty registration form returns proper errors |
| Priority | High |
| Estimated Time | 2 minutes |
| Procedure | Leave all fields blank → Click 'Register' |
| Expected Result | Error messages are displayed |
| Cleanup | Refresh form |
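
One way this manual case could be scripted is sketched below, assuming a registration endpoint at `/api/register` that returns field-level errors as JSON. The URL, payload shape, and expected status code are assumptions rather than the project's actual test harness.

```python
# Illustrative automation of FORM.REG.01 against an assumed /api/register endpoint.
import requests

BASE_URL = "http://localhost:8000"  # assumed local test deployment

def test_empty_registration_form_returns_errors():
    response = requests.post(
        f"{BASE_URL}/api/register",
        json={"username": "", "email": "", "password": ""},
    )
    body = response.json()
    # Expected result from the test plan: submission is rejected and
    # error messages are reported for the missing fields.
    assert response.status_code == 400
    assert "errors" in body and body["errors"]
```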

4.7.6 ADM.EXPORT.01 (Not Implemented – Optional)

| Field | Description |
|-------|-------------|
| Test ID | ADM.EXPORT.01 |
| Purpose | Verify admin can export data to CSV |
| Priority | High |
| Estimated Time | 3 minutes |
| Procedure | Admin Dashboard → Click 'Export CSV' |
| Expected Result | File download is triggered |
| Cleanup | Delete test file if needed |

4.7.7 GEN.POINTS.01

| Field | Description |
|-------|-------------|
| Test ID | GEN.POINTS.01 |
| Purpose | Verify user earns points when tagging music |
| Priority | High |
| Estimated Time | 2 minutes |
| Procedure | Tag a song → Check profile or database |
| Expected Result | Points increased |
| Cleanup | Reset points if needed |

4.7.8 GEN.LEADERBOARD.01

| Field | Description |
|-------|-------------|
| Test ID | GEN.LEADERBOARD.01 |
| Purpose | Confirm leaderboard ranks users correctly |
| Priority | Medium |
| Estimated Time | 3 minutes |
| Procedure | Users tag varying amounts → View leaderboard |
| Expected Result | Higher tag count = higher rank |
| Cleanup | None |

4.7.9 ADM.PANEL.01

| Field | Description |
|-------|-------------|
| Test ID | ADM.PANEL.01 |
| Purpose | Ensure admin can access all panel sections |
| Priority | High |
| Estimated Time | 2 minutes |
| Procedure | Log in as admin → Navigate through Dashboard, Users, Musics |
| Expected Result | All views accessible with no errors |
| Cleanup | Logout |

TEST RESULTS

| TC ID | Feature/Subfeature | Priority | Description | Result |
|-------|--------------------|----------|-------------|--------|
| GAM.POINTS.01 | Tag Counting | High | User tags 1 song, verify point increment | Pass |
| GAM.LBD.01 | Leaderboard Display | Medium | Leaderboard shows user’s rank after tagging | Pass |
| FORM.REG.01 | Required Fields | High | Submit registration form with empty fields, error messages appear | Pass |
| FORM.REG.02 | Email Format Check | Medium | Enter invalid email, system rejects input | Pass |
| FORM.LOGIN.01 | Wrong Credentials | High | Try to log in with wrong password, system displays error | Pass |
| FORM.INVALID.01 | Prevent Double Submit | Medium | Rapid double-click on submit only sends one request | Pass |
| GUI.STRT_BTN.01 | Start Tagging Page | High | Open homepage, click 'Start Tagging' button | Pass |
| GUI.SUB_BTN.01 | Submit Emotion Tag | High | Select emotion and click submit while listening | Pass |
| GUI.MUSIC_BTN.01 | Music Playback | High | Play, pause, skip, and return music controls | Pass |
| GUI.NAV.01 | Navigation Function | High | Click each nav bar item and verify redirection | Pass |
| ADM.USR.01 | Deactivate Account | High | Admin disables user and test login fails | Pass |
| ADM.STATS.01 | View Tagging Stats | High | Access analytics and confirm data chart | Pass |
| ADM.REV_TAGS.01 | Review Tag Logs | Medium | Admin filters and views historical tags | Pass |
| GEN.POINTS.01 | Point System | High | Tag a song and verify points increase | Pass |
| GEN.LEADERBOARD.01 | Leaderboard View | Medium | Check if users are ranked based on points | Pass |
| ADM.PANEL.01 | Admin Navigation | High | Admin can access Dashboard, Users, Musics | Pass |

5. Conclusion

The EmoTagger project is an inventive community-based method for understanding emotional bonds people establish with music. Using crowdsourcing and gamification, the platform effectively collects emotional data from diverse user groups, resulting in a strong and culturally varied dataset. The integration of AI-driven analytics ensures that the collected data is efficiently processed and utilized to enhance user experiences through personalized music recommendations and emotional trend analysis.

The platform's user-oriented design emphasizes accessibility, collaboration, and inclusivity, allowing users to take part in emotional tagging while enjoying an interactive and rewarding experience. Gamification features, including points, badges, and leaderboards, act as a strong incentive, sustaining user engagement and improving dataset quality.

From an academic and research viewpoint, EmoTagger offers important insights into the emotional and psychological effects of music, facilitating future studies in areas such as music therapy, psychology, and artificial intelligence. The platform's data privacy and security measures ensure that user information is handled carefully and kept safe, maintaining trust and transparency.

In conclusion, EmoTagger functions not only as a medium for labeling music by emotion but also as an engaging space for exploring and evaluating emotions. It links technology, psychology, and music, offering a flexible and scalable tool with significant potential for both personal and research applications. As the platform evolves, it is positioned to contribute substantially to emotional AI, music technology, and user experience design, and to serve as a kind of third-eye tool for tagging and evaluating music data. The framework could potentially be extended to annotate other kinds of data through the platform in the future.


6. Project Work Plan

workplanceng408

  • Figure 35: Project Work Plan

Project Work Plan on GitHub


7. References

  1. K. Beck et al., "Manifesto for Agile Software Development," Agile Alliance, 2001. [Online]. Available: https://agilemanifesto.org/

  2. S. Deterding, D. Dixon, R. Khaled, and L. Nacke, "From game design elements to gamefulness: defining 'gamification'," in Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, 2011, pp. 9-15.

  3. J. Howe, "The rise of crowdsourcing," Wired Magazine, vol. 14, no. 6, pp. 1-4, 2006.

  4. R. Cowie et al., "Emotion recognition in human-computer interaction," IEEE Signal Processing Magazine, vol. 18, no. 1, pp. 32-80, Jan. 2001.

  5. "General Data Protection Regulation (GDPR)," European Union, 2016. [Online]. Available: https://gdpr-info.eu/

  6. J. Howe, "The rise of crowdsourcing," Wired Magazine, vol. 14, no. 6, pp. 1-4, 2006.

  7. R. Cowie et al., "Emotion recognition in human-computer interaction," IEEE Signal Processing Magazine, vol. 18, no. 1, pp. 32-80, Jan. 2001.

