Interactive Projections
What Are We Talking About?
- Digital interactive experiences.
- Projected image for large-scale display.
- Layer of interactivity for viewers/participants.
A Little History
(Some) User Interaction Options
- Computer Vision with various kinds of cameras.
- Physical Computing with custom hardware, sensors, etc.
- Realtime networking from phones or other devices.
- Async networking from social media, etc.
Computer Vision
- Basic "webcam" type cameras or specialized cameras (stereo, IR, etc.).
- Computer vision or image processing software.
- CV ranges from simple blob detection to complex skeletal tracking (see the sketch after this list).
- Example: The Treachery of Sanctuary
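A minimal sketch of the simple end of that range, assuming OpenCV (`opencv-python`) and a basic webcam at index 0; the detected blob positions are the kind of data that would then be mapped into projection space to drive visuals.

```python
import cv2

# Open the default webcam (assumes a standard camera at index 0).
capture = cv2.VideoCapture(0)

# Configure OpenCV's built-in SimpleBlobDetector for bright regions.
params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea = 500          # ignore tiny specks of noise
params.filterByColor = True
params.blobColor = 255        # look for bright blobs

detector = cv2.SimpleBlobDetector_create(params)

while True:
    ok, frame = capture.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints = detector.detect(gray)

    # Each keypoint's center could be sent to Unity, p5, TouchDesigner, etc.
    for kp in keypoints:
        x, y = kp.pt
        cv2.circle(frame, (int(x), int(y)), int(kp.size / 2), (0, 255, 0), 2)

    cv2.imshow("blobs", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```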
Physical Computing
- Often tied to a microcontroller (Arduino, Teensy, etc.).
- Huge range of simple sensors.
- Many ways of communicating with software (keyboard emulation, serial, etc.); see the serial sketch after this list.
- Example: Giant Joystick
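One common path is plain serial: the microcontroller prints sensor readings and the projection software reads them. A minimal sketch of the receiving side, assuming pySerial and an Arduino printing one integer per line; the port name and baud rate are placeholders and should match your board.

```python
import serial

# Assumed port and baud rate; match whatever the Arduino sketch uses.
PORT = "/dev/ttyUSB0"
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=1) as connection:
    while True:
        line = connection.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # timed out or empty line
        try:
            value = int(line)  # e.g. an analogRead() value from 0-1023
        except ValueError:
            continue  # skip malformed lines
        # Normalize to 0.0-1.0 and hand off to the visuals.
        print(value / 1023.0)
```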
Async Networking
- Pulls remote data from social networks, APIs, etc. (see the polling sketch after this list).
- Can accumulate over a long period of time.
- Works for both remote and local users.
- Example: Tweetopia
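A minimal sketch of the async pattern, assuming the `requests` library and a hypothetical JSON endpoint (the URL and response shape are placeholders); new items accumulate locally and can be handed to the visuals whenever they arrive.

```python
import time
import requests

# Hypothetical endpoint returning a JSON list of posts; a real project
# would hit a social media or other web API here.
FEED_URL = "https://example.com/api/posts"

seen_ids = set()   # accumulate items over the life of the installation

while True:
    try:
        response = requests.get(FEED_URL, timeout=10)
        response.raise_for_status()
        for post in response.json():
            if post["id"] not in seen_ids:
                seen_ids.add(post["id"])
                # New remote content arrives; hand it to the visuals.
                print("new post:", post["text"])
    except requests.RequestException:
        pass  # network hiccups shouldn't crash the installation

    time.sleep(30)  # poll politely; async data doesn't need to be instant
```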
Realtime Networking
- Touch data from multiple users.
- Accelerometer data or other phone sensors.
- Can mix in-person and remote users.
- Example below...
Hello CETI
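A minimal sketch of the realtime pattern (a generic illustration, not the Hello CETI implementation), assuming a recent version of the `websockets` package: phones send touch or accelerometer readings as JSON over a WebSocket, and a small server relays them to everything else that is connected, including the projection app. The port and message format are assumptions.

```python
import asyncio
import json
import websockets

# All connected clients: phones sending input and the projection app receiving it.
clients = set()

async def handle(websocket):
    clients.add(websocket)
    try:
        async for message in websocket:
            # Expect JSON like {"type": "touch", "x": 0.5, "y": 0.25}
            # or {"type": "accel", "x": ..., "y": ..., "z": ...}.
            data = json.loads(message)
            # Relay every reading to all other connected clients.
            for client in tuple(clients):
                if client is not websocket:
                    try:
                        await client.send(json.dumps(data))
                    except websockets.ConnectionClosed:
                        pass  # a peer dropped; its own handler cleans up
    finally:
        clients.discard(websocket)

async def main():
    async with websockets.serve(handle, "0.0.0.0", 8765):
        await asyncio.Future()  # run until interrupted

asyncio.run(main())
```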
A Few More Thoughts
- The tools and techniques for creating digital, screen-based content (Unity, p5, TouchDesigner, etc.) still apply.
- Many interaction patterns still apply, but multi-user interaction often wins out.
- People love mirrors.
- Often the biggest challenges are related to robustness and reliability.
Easy Custom Input Hands-On