Cutscenes, End Game Screen, Tutorial User Testing

Introduction

In this sprint, we verified functionality and evaluated the art style with the help of User Testing. We primarily focused on two methods: Think Aloud and Time on Task. The User Testing was carried out with 12 people, all of whom provided verbal consent. Moreover, we aimed to reach a broader audience using a TAM-inspired form that showed static images of the game. This information will be used to improve the UI of the game in future sprints, which will enhance the user experience.

User Testing Plan

The User Testing will be divided into two main parts:

  1. Primary User Testing: The primary method of user testing involved one-on-one interaction with potential users. We chose a target audience mostly made up of users who play games more than once a month (and have never played this specific game), and were able to evaluate 12 users in total. There were overlaps with users who did not play games (2 of 15) and users who were in this studio (3 of 15), whom we labelled our fringe users. We chose the Think Aloud and Time on Task methods of User Testing as these provided a good evaluation of UI faults and also helped identify any glaring design issues. To prevent bias, we split this user group into two parts of six: 6 people were assigned to Think Aloud evaluation and 6 people were assigned to Time on Task evaluation. The responses to the primary user testing helped uncover the main errors with the game functionality and were our main source of feedback from user testing.

  2. Secondary User Testing: The secondary method of user testing was a TAM-inspired Google Survey that, with the help of an attached document, established the functionality being tested and asked the user to rate specific parts of it. For specific comments regarding functionality, a text box accompanied every User Testing section (Cutscenes, End-Game Screen, Tutorial). This ensured a more thorough understanding of our users, hopefully defining a more refined problem space. The aim of the form was to reach a broader audience beyond what we could cover with one-on-one testing. In this case, we received 3 responses. The secondary user testing helped uncover more subtle, less glaring errors and acted as a secondary source of feedback (feedback that supports the primary feedback, to be evaluated with less rigour but with an equal amount of thought).

Link to google form: https://docs.google.com/forms/d/e/1FAIpQLSc7jlJLtNUOSCuNELtEhEnz8Bh9Cq2gSHfJZoqrc2g9fdRWhw/viewform

Link to User Testing observations: https://docs.google.com/document/d/12GfLlyDjvugUzLAWqzpD2gmQHAfnNBn-F7bpIpH_06I/edit?usp=sharing

Main Points of Interest

Cutscenes

Our main area of interest was how well the art and dialogue conveyed the story-line. We wanted users to feel excited and intrigued about the game-play, hence the cutscenes had to have a serious tone. Our use of line-art and a black and white colour palette was pushed forward with these goals in mind; however, we were unsure if a user would understand the significance of these art choices. Moreover, the positioning of the next, skip and previous buttons was based purely on intuition. We wanted this positioning to be verified by our users before we proceeded to further style the buttons.

Tutorial

The tutorial screen's main objective was to help a new user understand gameplay. The tutorial was a brand new feature being implemented this sprint, hence we didn't entirely know what response to expect from the users. Our main objectives with this new feature were to ensure the user understands the tutorial, feels ready to play the game and does not get distracted by the art style. We left this part of the User Testing more open to comments to understand the general thoughts/feelings behind the tutorial.

End-Game Screen

The end-game screen's current iteration is static. Hence, our main goals here were to verify the art style and ensure it fit in with the rest of the game-play.

User Testing Results

Primary User Testing

Introduction

We gathered 12 responses, 6 in Time on Task and 6 in Think Aloud. To remain as unbiased as possible, we ensured the user group was a mix of people who enjoyed gaming and people who did not game often.

Response Evaluation

Time on Task

Think Aloud

Secondary User Testing

Introduction

We were able to gather 21 responses in our Google Survey. We treated these responses as our secondary source of feedback because we couldn't ask a user follow-up questions about how we could improve or which specific errors were the issue, and we didn't know the user's feelings and thoughts at that specific time. Regardless, we tried to make the questions as open ended as possible and took in written feedback wherever we could.

Response Evaluation

Users were asked to rate specific functionality on a scale of 1 to 7 (with 1 being "Highly Disagree" and 7 being "Highly Agree"). Every section was followed by a text box, allowing users to share any thoughts (and help us understand why the results were as they were). The functionality was displayed in a linked Google Document for new users to view.

Cutscenes

  1. The Cut-scenes help me understand the game's story and set the context well.
  2. I understand the language/artwork in the Cut-scenes.
  3. The positioning of the skip/previous/next buttons is intuitive.
  4. I enjoyed reading the Cut-scenes.
  5. Is there anything we could improve about the Cut-scenes?

The cutscene feedback was mainly focused on general button styling, in line with the primary user testing feedback. We should also pay close attention to the load times and provide more character context.

Tutorial

  1. I understood what the tutorial was trying to convey.

  2. After looking at the tutorial, I feel comfortable enough to play the game.

  3. The art-style of the tutorial (and any related characters) was relevant to the game.

  4. Is there anything we could improve about the tutorial?

The tutorial feedback was in line with the feedback from primary user testing: people didn't completely understand the game-play or design choices.

End-Game Screen

  1. The end-game screen was relevant to the rest of the game.
  2. The art style and language of the end-game screen made sense.
  3. Is there anything we could improve about the end-game screen?

The end-game screen feedback was also in line with the feedback from primary user testing: people were skeptical of specific design choices.

Reflection

Overall, users don't seem to fully understand the tutorial screen or the game-play. This may be because the tutorial is newer functionality that hasn't been fully implemented. In the future, we need to ensure that the basic functionality is covered so that users can focus on less glaring issues; this is something to address next sprint.

The cutscenes received generally good feedback, but the buttons are something we should look towards improving. Currently, the basic box boy buttons are present in the cutscenes, which takes away from the scene in general. Perhaps black and white image buttons would work better in this context: something that matches the black and white colour scheme of the cutscenes while not taking attention away from the images.

The end game screen's art style was made to reflect a user's psyche: red signals failure, so red was a predominant colour in this scene. However, users don't think that this style matched the rest of the game (which has a darker colour palette). We should perhaps look to maintaining consistency, as the feedback suggests, rather than trying something new.
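
On the cutscene button point above, the sketch below is one rough way monochrome image buttons could be built, assuming the cutscene UI uses libGDX Scene2D (the factory class, method name and asset paths are hypothetical placeholders, not existing code or assets):

```java
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.scenes.scene2d.ui.ImageButton;
import com.badlogic.gdx.scenes.scene2d.utils.TextureRegionDrawable;

public class CutsceneButtonFactory {
    /**
     * Builds a monochrome ImageButton from black-and-white up/pressed textures,
     * so the control blends with the cutscene line-art rather than competing with it.
     */
    public static ImageButton createBwButton(String upPath, String downPath) {
        TextureRegionDrawable up = new TextureRegionDrawable(new TextureRegion(new Texture(upPath)));
        TextureRegionDrawable down = new TextureRegionDrawable(new TextureRegion(new Texture(downPath)));
        // Separate idle/pressed drawables keep the button readable without adding colour.
        return new ImageButton(up, down);
    }
}

// Example usage (asset paths are placeholders, not real files in the repo):
// ImageButton next = CutsceneButtonFactory.createBwButton(
//         "images/cutscene/next_bw.png", "images/cutscene/next_bw_pressed.png");
```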

Next Sprint

In general, we realised that the cutscenes were successfully implemented and, with a few tweaks, could be brought to a close early next sprint. Before then, however, our aims with the cutscenes should be:

  • Design buttons, ideally image buttons that fit in with the rest of the cutscenes.
  • Adjust button positioning so that no text is covered (see the positioning sketch at the end of this section).
  • Re-test some designs with other users to get a better idea of how we could go about improving the art style.

The tutorial screen requires a lot of work and should be our main aim next sprint. We should focus on:

  • Positioning of the implemented sprite.
  • Expanding the tutorial to all game-play aspects.
  • Making everything more interactive.

Moreover, we should aim to conduct user testing after implementing these features to ensure they align with what our users want, and decide on further implementation using the feedback received then. The end game screen was a relatively new feature but less intense than the tutorial, so its feedback should be accounted for by next sprint:

  • Ensuring a colour palette more consistent with the main menu screen.
  • Animating the background.

Based on the general comments received, we should also aim to improve the UI of the game.
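
For the "no text is covered" item above, here is a minimal sketch of one possible layout fix, again assuming the cutscene UI uses libGDX Scene2D; the class, method and button parameters are hypothetical and only illustrate the idea of pinning the controls away from the dialogue text:

```java
import com.badlogic.gdx.scenes.scene2d.Stage;
import com.badlogic.gdx.scenes.scene2d.ui.ImageButton;
import com.badlogic.gdx.scenes.scene2d.ui.Table;

public class CutsceneControlsLayout {
    /**
     * Pins previous/skip/next in a single row along the bottom edge of the stage,
     * keeping them clear of the dialogue text drawn higher up the screen.
     */
    public static void layoutButtons(Stage stage, ImageButton prev, ImageButton skip, ImageButton next) {
        Table table = new Table();
        table.setFillParent(true); // table spans the whole stage
        table.bottom();            // anchor the row to the bottom edge
        table.add(prev).pad(15f).expandX().left();   // previous hugs the bottom-left corner
        table.add(skip).pad(15f);                    // skip sits centred between the two
        table.add(next).pad(15f).expandX().right();  // next hugs the bottom-right corner
        stage.addActor(table);
    }
}
```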