Juicy UI Testing and Validation

Initial User Testing to Review Sprint 3 Guidebook Efforts

Description:

To understand how potential users would experience the guidebook, we conducted user testing to gather feedback for improvements. We hope to implement these improvements in sprint four as part of the finalisation of the feature. Note: the demographic is Australian teenagers aged 12-16 with sufficient gaming ability/experience.

Testing Plan:

The testing plan outlines the observational approach, which was then followed up by questions. Observational testing allows us to see how users interact with the game and then employ the guidebook in a natural setting. The following is the outline for the user testing:

  1. Users play the game as per usual.
  2. End testing after the user interacts with the guidebook for a sufficient time.
  3. Question specific actions or moments observed in the guidebook interaction.

Results:

User A:

  • Observation Recording: The user started the game by pressing start, spent ten minutes on the main game screen, and then returned to the start page. They clicked all available icons and then returned to the game, where they clicked the guidebook. They flicked through the pages and read about the enemies; after five minutes of reading the page, we ended the testing there.
  • Follow-up Question and Response: Why did you return to the start page after playing the game? "I was looking for the instructions as most games I've played in the past have a section near the settings where you can get tips or instructions, so I thought I'd find the guidebook there to get information about the enemies".

User B:

  • Observation Recording: Unlike the other test participants, the user started by clicking the guidebook and reading through each page. Since the user hadn't spent much time in the game itself, we extended the testing time by an extra ten minutes. The user showed a slightly disgruntled facial expression when looking at the buildings after they opened the shop.
  • Follow-up Question and Response: Why did you feel dissatisfied when looking at the buildings? "I guess I was confused because the guidebook had images different from the buildings in the game. Not that they need to be identical like symbols for buildings would do, but it was weird that they seemed like actual buildings in the game, but then they weren't".

User C:

  • Observation Recording: The user played the game for twenty minutes and then returned to the start page to exit, then clicked the guidebook, seemingly out of curiosity. They flicked through to the fourth page but appeared unimpressed and then closed the game.
  • Follow-up Question and Response: Were there any recommendations you'd propose to improve the guidebook? "Yes, the overall interface was a bit dull, so you could add noises to the buttons to make it more engaging or add colour where possible to increase liveliness".

Outcome:

The observational testing revealed several potential tasks we can take on in this sprint. These include the following:

  • Move the guidebook access button to the start page and remove it from the main game screen.
  • Introduce more colour into the game where possible.
  • Add sounds to button interactions.
  • Update the guidebook images to current in-game versions or alternative graphics.

User Testing for Music of Guidebook

Description:

The guidebook was identified as lacking atmosphere: without sound, it fails to induce the mysterious and compelling sense of the game. Music can influence emotion, deepen the story, and make the game more immersive. The following user testing aims to identify which track appeals to users when reading the guidebook, such that the music does not interfere with the act of reading while still suiting the game theme. Note: the chosen track will be modified after the user testing to further match the game.

Testing Plan:

Work through each task whilst looking at the guidebook screen and observe which track matches the screen's energy.

Task 1: Listen to music track 1 - Aftershocks

https://artlist.io/song/63912/aftershocks

Task 2: Listen to music track 2 – Insomniac’s Dream

https://artlist.io/song/84337/insomniac's-dream

Task 3: Listen to music track 3 – Birthday Suite

https://artlist.io/song/55267/birthday-suite

Task 4: Listen to music track 4 – Down the Rabbit Hole

https://artlist.io/song/87587/down-the-rabbit-hole

Questions to Users Post Test:

Which music track appealed to you the most and why?

Results:

User 1:

  • Preference – Track 1 and 4.
  • Explanation: ‘The last option allowed you to process the guidebook information on the page easily and at the same time experience no anxiety/stressful related emotions about opening the book. Plus, it would not get annoying whilst the guidebook is open and if you have to refer back to it. Also, track 1 was also pretty good, so I’d consider it a preference, but I am leaning towards the last’.

User 2:

  • Preference – Track 1.
  • Explanation: ‘Track 1 is the sound you hear in most video games for some quest-related games where you have to check out something like a help page or logbook. Also, the music is not too suspenseful and has a great flow, which is essential as it sounds like it is sitting in the background and not interfering, but still instils a sense of fun.’

User 3:

  • Preference – Track 1.
  • Explanation: ‘Personally I prefer option one as it had a slight bop to it although it was still mysterious and chill. The other options felt a bit too suspenseful for an information-based page and just didn’t really match up when I was trying to read through the pages at the same time. I guess to elaborate it wasn’t in your face which is what I felt with the other tracks’.

User 4:

  • Preference – Track 1.
  • Explanation: ‘Straight off the bat, the rhythm and, dare I say, “texture” of the music was on point. To describe track one, I’d say it is playful, and you could say lightly playful. I still feel a part of the game, but in the guidebook part, I think the guidebook should be toned down from the main game as the music is heavier and quicker dynamically, which makes it more suspenseful.’

User 5:

  • Preference – Track 4.
  • Explanation: ‘I really enjoyed track four as it was personally relaxing. I liked how the tone started low and slowly grew, then dropped and followed a smooth pattern which was soothing. For this guidebook, I think it would make sense to add track four as you are making the user have to digest information that can be overpowering/overbearing and turn off the person from reading’.

Outcome:

The user testing was highly beneficial in identifying the best music to use as the base for the guidebook screen. There was a clear preference for track one, which four of the five users named, closely followed by track four. Interestingly, no users gravitated towards tracks two or three; it appears they did not suit the guidebook and could have interrupted engagement with the content. There was an overall sense that the music should not disrupt readability, so track 1 will be modified in GarageBand to cater further to this.
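To illustrate how the modified track could sit behind the guidebook screen, below is a minimal sketch assuming the game's libGDX audio API. The class name and the asset path "sounds/guidebook_theme.mp3" are placeholders rather than the project's actual identifiers, and the team's existing resource-loading setup may be preferable to loading the file directly.

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Music;

/** Sketch only: looping, low-volume background music for the guidebook screen. */
public class GuidebookMusic {
    private final Music track;

    public GuidebookMusic() {
        // Placeholder asset path for the modified track 1.
        track = Gdx.audio.newMusic(Gdx.files.internal("sounds/guidebook_theme.mp3"));
        track.setLooping(true);  // keep playing while the guidebook stays open
        track.setVolume(0.3f);   // quiet enough not to disrupt reading
    }

    public void play() {
        track.play();
    }

    /** Stop and release the track when the guidebook screen is closed. */
    public void dispose() {
        track.stop();
        track.dispose();
    }
}
```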

User Testing for Button Click Sound

Description:

Boosting the juiciness of the game can be achieved by adding sounds to the guidebook screen. Fellow studio staff and members recommended adding a slight noise when the button is pressed to move to the following page, potentially increasing the juiciness levels. The subsequent user testing aims to identify which sound appeals to users when clicking to the next page of the guidebook, such that the sound suits the game theme and makes the interaction more engaging.

Testing Plan:

Task 1: In a group, brainstorm 10-15 descriptive words or expressive terms for button qualities. Consider both positive and negative descriptive words. The test coordinator will scribe these words on paper.

Task 2: Now the users must group these words into positive and negative groups, and the test coordinator will mark them green and red, respectively.

Task 3: Each user will listen to each sound recording with their eyes shut. Then repeat the sound recording while simulating the environment of clicking the button to get to the next page.

Task 4: Then each user will draw four arrows from descriptive terms to the corresponding sound options based on their opinions of the sounds they heard.

Results:

(Results image: users' arrow connections from the descriptive terms to the three sound options.)

Outcome:

The winning criterion for the button selection is the highest number of positive (green) arrow connections and, in turn, the fewest negative (red) connections. After the testing, the test coordinator recorded the following:

  • Sound 1 -> 3 RED, 1 GREEN

  • Sound 2 -> 0 RED, 11 GREEN

  • Sound 3 -> 3 RED, 1 GREEN

As a result, sound two was far ahead of the other two options. The central reasoning behind this choice was that it felt game-like, playful, even and clean, and demonstrated a clear interaction fitting the game theme. Hence, sound 2 will be integrated into the gameplay.
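As a hedged sketch of how sound 2 might be attached to the guidebook's page buttons, assuming the game's libGDX scene2d UI: the class name, asset path and volume below are illustrative, not the project's actual values.

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Sound;
import com.badlogic.gdx.scenes.scene2d.Actor;
import com.badlogic.gdx.scenes.scene2d.ui.TextButton;
import com.badlogic.gdx.scenes.scene2d.utils.ChangeListener;

/** Sketch only: plays the chosen click sound whenever a guidebook button is pressed. */
public class GuidebookButtonSound {
    // Placeholder asset path for sound 2.
    private final Sound click =
            Gdx.audio.newSound(Gdx.files.internal("sounds/guidebook_click.wav"));

    /** Attach the click sound to a navigation button such as "next page". */
    public void attach(TextButton button) {
        button.addListener(new ChangeListener() {
            @Override
            public void changed(ChangeEvent event, Actor actor) {
                click.play(0.8f); // slightly below full volume so it stays subtle
            }
        });
    }

    public void dispose() {
        click.dispose();
    }
}
```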

User Testing for Flicking Page Sound

Description:

Boosting the juiciness of the game can be achieved by adding sounds to the guidebook screen. Fellow studio staff and members recommended adding the sound of a book being flicked through to the page-flicking animation to potentially increase the juiciness levels. The subsequent user testing aims to identify which sound appeals to users when observing the page-flicking animation in the guidebook, such that the sound suits the game theme and makes the interaction more engaging.

Testing Plan:

Listen to each sound recording with your eyes shut. Then repeat the sound recording while simulating the environment of flicking through pages.

Task 1: Listen to sound recording 1 with your eyes shut. Then repeat sound recording 1 while simulating the environment of flicking through pages.

Task 2: Listen to sound recording 2 with your eyes shut. Then repeat sound recording 2 while simulating the environment of flicking through pages.

Task 3: Listen to sound recording 3 with your eyes shut. Then repeat sound recording 3 while simulating the environment of flicking through pages.

Questions to Users Post Test:

Which sound effect appealed to you the most and why?

Results:

User 1: "I preferred option two as it felt quite clean, and the timing was on point for flicking pages. I mean that it wasn't too quick or slow, just right in the middle."

User 2: "I enjoyed the second sound as it was very smooth. The other two sounds were not my favourite as the first one was quite sloppy and ended in an odd thud, and the last one also felt imbalanced, like inconsistent in rhythm."

User 3: "A slow book flick is the most satisfying in my eyes, so I'd go with option three on that one as the other two were too quick. Option two was the best following three, as it was the second slowest. My only concern with three is if it'd fit in the duration of the flip animation; it may need to be fastened up to accommodate".

Outcome:

Most testers agreed that option two was the best sound to use for flicking pages. It appears the speed of the sound has a flow-on effect on the perceived smoothness of the page flick; users seem to like a 'clean' and 'crisp' page-flicking sound. Option two will be added to the guidebook as a result.
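A rough sketch of how the chosen flick sound could be triggered alongside the page-turn animation, assuming libGDX's Sound.play(volume, pitch, pan) overload: the pitch parameter (roughly 0.5-2.0) can speed the effect up or slow it down to fit the animation's duration, which addresses User 3's timing concern. The class name and asset path are placeholders, not the project's actual identifiers.

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Sound;

/** Sketch only: page-flick sound synchronised with the guidebook's flip animation. */
public class PageFlipAudio {
    // Placeholder asset path for the chosen flick sound (option two).
    private final Sound flick =
            Gdx.audio.newSound(Gdx.files.internal("sounds/page_flip.wav"));

    /**
     * Call when the page-turn animation starts. A pitch above 1 shortens the
     * sound and below 1 lengthens it, so it can be matched to the flip duration.
     */
    public void onPageTurn(float pitch) {
        flick.play(1.0f, pitch, 0f); // volume, pitch, pan (0 = centre)
    }

    public void dispose() {
        flick.dispose();
    }
}
```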

Final Guidebook User Testing

Description:

Throughout DECO2800, our team has found significant value in navigation-based user testing. Navigational user testing trials the user's ability to use the features we've implemented and is the final assessment of whether our work is of good quality and ultimately pleases our audience.

Since it is the last sprint, we have opted for more summarised user testing, with a larger audience (10 people) contributing but less individual input from each. The demographic ranges from young teenagers to young adults (12-18 years old), with a mix of genders. See the testing plan for more details.

Testing Plan:

The following testing plan outlines the tasks users must complete in the game environment. All tasks are navigation-based, meaning the user must identify these functionalities themselves.

  1. Experience the loading screen and observe a fact about the game.
  2. Open up a building pop-up.
  3. Find the achievements page in the guidebook.

To document the testing results, the users must fill out the following table by rating each feature. The rating system is a Likert-style scale, where 1 is the lowest (bad) rating and 10 is the highest (good) rating.

Results:

Outcome:

After reviewing the scores from ten users, our team is highly pleased with the feedback and the high results. Our lowest score across all categories was an 8, which is still considerably high, with 10 being the highest score possible. Users agree that our game is on theme and that all the features we've implemented are usable, intuitive, enhancing or helpful. The lowest total score was awarded to the helpfulness of the loading screen; however, that is expected, as users aren't necessarily interacting with it directly since it is a by-product of the entire game's functionality. In contrast, the highest total score went to the intuitiveness of the building UI. Our team had many technical problems implementing this feature, as it depended significantly on other groups, so it is great that users find it useful in the game and easy to use. Overall, we are overwhelmed by the positive feedback from users, and we hope the entire game is enjoyable for players.