Game Design Document
High-Level Concept/Design
Working Title: Ghost Hands
Concept Statement: Learn ASL basics in VR, sharpen your skills with a series of ASL games, and then step out into ASL VR chat, where you can build friendships with other members of the ASL community.
Genre: Learning game
Target Audience:
- All Ages
- Deaf community
- Family/friends of someone in the ASL community
- Anyone who wants to learn and practice ASL
Unique Selling Points:
- Few ASL VR learning games currently on the market
- Opportunity to grow the ASL community by teaching people ASL and enabling them to connect with other community members
- ASL “world” in the metaverse
Product Design
Product Description:
We will use 26 3D hand models, one for each letter of the ASL alphabet. The application will be driven entirely by hand tracking. The user will be able to grab a hand model and move or rotate it to examine it from every angle. Using hand tracking and the VR headset's cameras, the user will copy the sign they see on screen with their own hand. When the application recognizes that the correct hand sign has been made, it will automatically move on to the next letter of the alphabet. Learning mode will allow unlimited attempts at each sign, and users will also be able to manually advance to the next letter. Test mode will let users test the knowledge they have gained in learning mode, allowing three failures before requiring a restart. The stretch goal for this product is to introduce a game aspect to learning.
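A minimal sketch of how the sign-recognition check might work, assuming Unity and a hand-tracking SDK that exposes per-joint rotations (e.g., OVRSkeleton from the Oculus Integration). The joint arrays, target pose data, and tolerance value here are illustrative assumptions, not final design decisions:

```csharp
using UnityEngine;

// Sketch: compares the user's tracked hand pose against a stored target pose
// for one ASL letter. Joint rotations are assumed to come from a hand-tracking
// SDK (e.g., OVRSkeleton in the Oculus Integration); names are illustrative.
public class SignMatcher
{
    // Maximum per-joint angular error (degrees) before a joint counts as wrong.
    // 15 degrees is an assumed starting point to be tuned during playtesting.
    private const float AngleTolerance = 15f;

    // Returns true when every tracked joint is within tolerance of the target.
    public static bool Matches(Quaternion[] trackedJoints, Quaternion[] targetPose)
    {
        if (trackedJoints.Length != targetPose.Length)
            return false;

        for (int i = 0; i < trackedJoints.Length; i++)
        {
            // Quaternion.Angle returns the rotation difference in degrees.
            if (Quaternion.Angle(trackedJoints[i], targetPose[i]) > AngleTolerance)
                return false;
        }
        return true;
    }
}
```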
Player Experience & Game POV:
Player: Represented by a set of hands
Setting: Minimalist environment
Visual & Audio Style:
Visuals: 3D hand representation, semi-empty room with few distractions
Audio: Minimal audio – a ding and a buzzer for right and wrong answers, a level-up sound, and haptic feedback (a single vibration for wrong answers, a double vibration for right answers)
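A sketch of this feedback scheme, assuming standard Unity `AudioSource` clips for the ding and buzzer. The haptic call is left as a placeholder, since controller-free hand tracking has no built-in vibration channel; how haptics are actually delivered is an open design question:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: plays the ding/buzzer and single/double vibration pattern described
// above. Audio uses standard Unity APIs; the haptic pulse is a placeholder.
public class FeedbackPlayer : MonoBehaviour
{
    public AudioSource audioSource;
    public AudioClip ding;    // correct answer
    public AudioClip buzzer;  // incorrect answer

    public void PlayCorrect()
    {
        audioSource.PlayOneShot(ding);
        StartCoroutine(Vibrate(pulses: 2)); // double vibration for right answers
    }

    public void PlayIncorrect()
    {
        audioSource.PlayOneShot(buzzer);
        StartCoroutine(Vibrate(pulses: 1)); // single vibration for wrong answers
    }

    private IEnumerator Vibrate(int pulses)
    {
        for (int i = 0; i < pulses; i++)
        {
            // TODO: trigger a haptic pulse here if the final input method supports it.
            yield return new WaitForSeconds(0.15f); // assumed pulse spacing
        }
    }
}
```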
Platforms/Technology: Oculus Quest, 3D, Unity
**Assets:**
- Avatar
- First person avatar view
- 3D hand positioning for the ASL alphabet
- 3 scene backgrounds
- Fanfare
**Sound:**
- Ding (for correct positioning)
- Buzzer (for incorrect positioning)
- Level up sound effect
- Level up haptic effect
Scope:
**Team (7) / Roles:**
**4 Application Development Students:** Amber Dulz (Product Owner), Alejandra Valencia (Project Manager), Darren Ross (Scrum Master), Carl Nelson (GitHub Merge Manager)
**2 Business Intelligence Students:** Xianxian Wang (BI Lead), Sophie Conroy (Admin)
**1 Digital Media Art Student**
Project Timeline:
Project Duration: April 4th – June 10th (approximately 2 months)
Project Schedule Outline:
**Project launch - 1 month (March 8 - April 4)**
- Set up a dev environment
- Collect the basic set of assets
- Create a sprint schedule
- Finalize documentation (GDD, BI docs, etc.)
**Development - 2 months (April 4th - June 4th)**
- Write a script to compare hand positioning
- Game loops
**QA - 2 weeks (May 30th - June 10th)**
- Bug fixes
- Presentation deck
Risks
- Project deadline is not met
- Product cannot be built as designed
Game System Design
**Core Loops**
- Tutorial
- Learning mode
- Test mode
**Core Gameplay Mechanics Brief**
**Learning Mode**
The computer will generate a 3D hand model of an ASL letter. The user will mimic the displayed sign with their own hand via hand tracking. The computer will move on to the next letter once it recognizes that the user has signed the correct letter.
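A sketch of the learning-mode loop, building on the `SignMatcher` idea above. `ShowModel`, `ReadTrackedJoints`, and `letterPoses` are hypothetical placeholders for displaying the 3D hand and reading the user's tracked pose:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: learning-mode loop. Walks the alphabet, shows each letter's 3D hand
// model, and waits until the user's tracked pose matches before advancing.
// ShowModel, ReadTrackedJoints, and letterPoses are hypothetical placeholders.
public class LearningMode : MonoBehaviour
{
    public Quaternion[][] letterPoses; // target joint rotations for A-Z (26 entries)

    private IEnumerator Start()
    {
        for (int letter = 0; letter < 26; letter++)
        {
            ShowModel(letter); // display the grabbable 3D hand for this letter

            // Unlimited attempts: poll each frame until the sign is recognized.
            while (!SignMatcher.Matches(ReadTrackedJoints(), letterPoses[letter]))
                yield return null;

            // TODO: play the ding / double vibration, then advance.
        }
    }

    private void ShowModel(int letter)
    {
        /* placeholder: swap in the letter's hand model */
    }

    private Quaternion[] ReadTrackedJoints()
    {
        /* placeholder: query the hand-tracking SDK for current joint rotations */
        return new Quaternion[0];
    }
}
```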
**Test Mode**
- The computer displays a letter of the alphabet.
- The user performs the hand and finger positioning for that letter in ASL.
- The user is allowed 3 failed attempts (each within a time limit) before test mode requires a restart, as sketched below.
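A sketch of the three-strikes rule, assuming each attempt either times out or is judged by the same pose check as learning mode. The time limit and reset behavior are assumed values to be tuned:

```csharp
// Sketch: test-mode strike tracking. Three failed attempts (each bounded by a
// time limit) force a restart, per the rule above. Values are placeholders.
public class TestModeScorer
{
    public const int MaxFailures = 3;
    public const float AttemptTimeLimit = 10f; // assumed seconds per attempt

    private int failures;

    // Call when an attempt ends; returns true if the test must restart.
    public bool RecordAttempt(bool signedCorrectly)
    {
        if (!signedCorrectly)
            failures++;

        if (failures >= MaxFailures)
        {
            failures = 0;  // reset the counter for the new run
            return true;   // caller restarts the test
        }
        return false;
    }
}
```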
**Game Mode (stretch goal)**
- The computer displays 3D hands signing ASL letters that move toward the user.
- The user must sign the correct letter before it reaches them to earn points.
- If the user makes an incorrect hand sign, they lose points.
- The player can adjust the speed at which letters approach; see the sketch below.
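A sketch of the stretch-goal mechanic: a letter's hand model drifts toward the player at an adjustable speed, awarding or deducting points when the sign is resolved. The speed and point values are assumptions:

```csharp
using UnityEngine;

// Sketch: stretch-goal game mode. A letter's hand model moves toward the
// player; signing it correctly before it arrives scores points, signing it
// wrong loses points. Speed is player-adjustable; values are placeholders.
public class ApproachingLetter : MonoBehaviour
{
    public Transform player;       // the user's head/camera position
    public float speed = 1.0f;     // adjustable approach speed (m/s)
    public int reward = 10;        // points for a correct sign (assumed)
    public int penalty = 5;        // points lost for an incorrect sign (assumed)

    private void Update()
    {
        // Drift toward the player each frame.
        transform.position = Vector3.MoveTowards(
            transform.position, player.position, speed * Time.deltaTime);
    }

    // Called by the recognition system once the user commits to a sign.
    public int Resolve(bool signedCorrectly)
    {
        Destroy(gameObject);
        return signedCorrectly ? reward : -penalty;
    }
}
```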
**Influences**
Demo by Daniel Beauchamp: Oculus Quest Networked Hand Tracking Points To Future Of Social VR (uploadvr.com)
This is an influence for our game because the developer uses Oculus hand tracking (and Normcore, a multiplayer networking framework for Unity) to build a platform for playing rock-paper-scissors with the most basic avatar and scene.
Using ASL in virtual reality: Sign Language In VR ‘Worth Exploring’ As Hand Tracking Improves (uploadvr.com)
The person in this video is an ASL teacher who explains the differences between using ASL in real life and in virtual reality. Players are equipped with full-body avatars and use facial recognition and hand tracking to communicate in ASL. In the second video, three people (one deaf, two hearing) converse in ASL in a VR room.
VR ASL with Hand Tracking: Learn Sign Language with Virtual Reality and Hand Tracking – LearnVR.org
College student creates German Sign Language training program for Oculus Quest platform