Literature Review - miel-uqac/XR_Nomadic GitHub Wiki
Promoting Movement in Office XR Environments via Subconscious Interventions
Introduction
Sedentary behavior in office settings is a well-documented health risk, contributing to musculoskeletal issues and metabolic problems. As virtual reality (VR) and mixed reality (MR) technologies become more common for work (e.g. virtual meetings, design, or remote collaboration), there is concern that extended XR use might further reduce physical activity. At the same time, XR platforms offer unique opportunities to embed physical movement into daily work in subtle ways. Rather than prompting explicit exercise breaks, researchers are exploring subconscious interventions – techniques that encourage users to move without requiring conscious effort. These include visual or spatial manipulations of the virtual environment, ambient cues that nudge posture or activity, and subtle game mechanics or interactions that incidentally promote movement. This literature review surveys recent academic research on such interventions in VR/MR office contexts. We summarize key findings on their effectiveness, categorize the types of techniques used, identify the XR platforms and technologies involved, describe targeted behaviors (from small posture shifts to walking), and note the populations studied. We also highlight gaps in current research and promising directions for future work.
Types of Subconscious Movement Interventions in XR
Spatial Manipulations and Visual Distortions
One class of interventions uses the plasticity of virtual environments to manipulate spatial perception in order to increase physical movement. VR allows altering distances, scales, or perspectives in ways that go unnoticed but change user behavior. Spatial distortion techniques have proven effective in encouraging extra movement. For example, manipulating virtual distances has been shown to make people walk more on a treadmill: in one study, participants walked in VR scenes where the environment was subtly “stretched” (distances slightly increased) or “compressed” (distances reduced) without their awareness [1]. The results were striking – walking distance was significantly affected by the manipulation. Users walked farther in the stretched-distance environment than in a normal-scale environment, and the least in the compressed environment [1]. Crucially, none of the participants noticed that the virtual hallway length had changed across conditions [1]. This demonstrates that scaling the virtual world can implicitly nudge users to take more steps, effectively increasing physical activity (in this case, walking endurance) without an explicit prompt. Such spatial tricks leverage the fact that in VR the brain accepts the altered geometry as real; by the time the user reaches a visual goal, they have unknowingly covered extra real-world distance. Beyond distances, other visual distortions can similarly encourage movement – for instance, redirected walking techniques that subtly rotate or shift the virtual camera view can cause users to physically turn or walk more than they realize, allowing continuous locomotion in confined spaces. These methods, originally devised to manage VR play space, can be repurposed as health interventions (e.g. increasing step counts) if integrated thoughtfully.
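The scaling mechanism behind [1] can be made concrete with a short sketch. The following Python fragment is illustrative only: the study used a treadmill-coupled VR scene, and the function names, values, and the 1:1 gait-mapping assumption here are ours, not the authors'.

```python
def stretch_hallway(goal_distances_m, scale):
    """Uniformly rescale the distances of visual goals along the walking axis.

    scale > 1.0 "stretches" the hallway: the end appears farther away, so the
    user covers more real-world distance before reaching it; scale < 1.0
    compresses it. Near-unity scales typically go unnoticed [1].
    """
    return [d * scale for d in goal_distances_m]

# With a 10% stretch, a goal originally placed 50 m away is rendered at 55 m,
# so a user walking to it (assuming 1:1 gait mapping) covers an extra 5 m
# without noticing the manipulation.
stretched = stretch_hallway([50.0], 1.1)
```

The same one-parameter scale could in principle be tuned per user, since the threshold at which the distortion becomes noticeable likely varies between individuals.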
Visual manipulations can also target vertical movement and posture. Slight changes in perspective can make objects appear just out of reach or at a higher vantage, prompting users to adjust their physical position. For example, an XR system could gradually elevate the position of a virtual toolbar or document until the user instinctively stands up to see it better – a subtle way to induce a posture change. An experimental illustration of this approach appears in a study on “Body-Follows-Eye” in VR [2]. In that work, the content on a virtual screen was dynamically repositioned based on where the user was looking, with the goal of influencing body posture. By moving interface elements or the viewpoint ever so slightly, the system encouraged the user’s body to follow (e.g. sit up taller or center themselves) to maintain optimal view. Because the shifts were slow and within perceptual thresholds, users naturally adjusted their posture without feeling a deliberate correction. This approach overlaps with adaptive interfaces (discussed more below) and shows how virtual view manipulation can serve as a subconscious nudge.
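The "below-threshold drift" at the heart of this technique can be sketched as a clamped per-frame update. The speed threshold below is an illustrative assumption for the sketch, not a value reported in [2]; real systems would need per-user calibration.

```python
# Illustrative detection threshold (m/s); not a parameter from the cited work.
MAX_UNNOTICED_SPEED_M_S = 0.002

def drift_toward(current_y, target_y, dt_s, max_speed=MAX_UNNOTICED_SPEED_M_S):
    """Move a virtual screen's height one step toward target_y, clamping the
    per-frame displacement so the motion stays below a noticeable speed."""
    max_step = max_speed * dt_s
    delta = target_y - current_y
    if abs(delta) <= max_step:
        return target_y
    return current_y + (max_step if delta > 0 else -max_step)
```

Called once per frame, this produces a drift so gradual that the user's head "follows" the content rather than perceiving the screen as moving.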
Adaptive Virtual Workspaces and Posture Nudging
Closely related to spatial tricks are adaptive workspace interventions in XR – virtual environments or interfaces that actively adjust to user behavior in order to promote healthier postures and movements. A prominent example is dynamic repositioning of screens or content in VR to counteract slouching. Researchers have drawn inspiration from physical ergonomics: for instance, Shin et al. initially demonstrated with a physical robotic monitor that moving a screen very slowly can unobtrusively coax a person into sitting upright [3]. By moving the monitor at an imperceptibly slow speed, their system caused users to gradually straighten their backs and follow the screen’s motion, resulting in spontaneous posture corrections. Users did not consciously realize the monitor was “nudging” them.
Building on this idea, the same team applied it in VR by gradually shifting a virtual screen’s position and angle in front of the user [2]. The VR screen would subtly move if it detected the user’s viewpoint at a slouched angle, thereby encouraging the user to re-center their head and torso.
An independent study by McGill et al. took a related approach: in an egocentric VR workspace, they implicitly adjusted a virtual display based on the user’s head orientation to reduce neck strain [4]. Notably, these adaptations yielded measurable benefits – McGill and colleagues found that such implicit screen movements minimized reported neck fatigue and discomfort while still granting the user access to a wide virtual desktop [4].
In immersive settings, analogous adaptive techniques can be purely software-driven. For example, a recent VR gaming study explored visual feedback interventions for head posture: if the user started to slouch or tilt their head down for too long, the game would respond with subtle visual changes to prompt correction. One design turned the scene grayscale whenever the user’s posture was poor, restoring color only when they sat upright – an ambient cue rather than an explicit warning [5].
Another design placed a floating virtual ball that would rise upwards in the user’s view; the user would naturally lift their head to keep the ball in sight, thereby resuming an upright posture [5]. These cues operate implicitly: the user just perceives a change in the environment (color desaturation or a moving object) and reacts by adjusting their body, without a direct on-screen message telling them what to do. In evaluations, such ambient visual cues were effective in reducing slouching – participants’ bad posture periods dropped, and several users preferred the more subtle cue (moving ball) because “it made me naturally look up, rather than having to interpret the meaning of an icon”, as one participant reported [5]. This underscores the advantage of implicit cues: they integrate into the experience more seamlessly than overt alerts, correcting posture “naturally” from the user’s perspective.
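The two cues described above reduce to a simple mapping from measured head pitch to ambient scene parameters. The sketch below is a hedged illustration of that mapping; the slouch threshold and fade rates are invented for the example, not values from [5].

```python
def posture_cue(neck_pitch_deg, slouch_threshold_deg=20.0, fade_range_deg=30.0):
    """Map head pitch (0 = level, positive = looking down) to ambient cues.

    Returns scene color saturation (1.0 = full color, fading toward grayscale
    as slouch worsens) and how far the floating ball should rise (0.0-1.0).
    Gradual fades replace a hard alert; all parameter values are illustrative.
    """
    if neck_pitch_deg <= slouch_threshold_deg:
        return {"saturation": 1.0, "ball_rise": 0.0}
    excess = neck_pitch_deg - slouch_threshold_deg
    return {
        "saturation": max(0.0, 1.0 - excess / fade_range_deg),
        "ball_rise": min(1.0, excess / fade_range_deg),
    }
```

Because both outputs change continuously, the user experiences a scene that gently "responds" to their posture rather than a binary warning that fires at a threshold.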
Another innovative approach in this category involves actuated furniture. While not purely virtual, it can be combined with XR use. Fujita et al., for instance, designed an intelligent office chair (“TiltChair”) that very slowly tilts its seat forward over time [6]. The gentle incline eventually makes sitting uncomfortable or unstable, essentially forcing the user to either shift position or stand up. Because the change is gradual, users often stand without fully realizing the chair was the cause.
Fujita et al. found that a slow seat inclination could successfully trigger workers to stand up from sitting, without disrupting their task performance [6]. This kind of physical nudge can be synchronized with VR or MR sessions – for example, during a long virtual meeting, the chair might imperceptibly rise to coax the user into a standing posture halfway through. These proactive interventions contrast with manual sit-stand desk adjustments because they do not rely on the user’s conscious decision: the environment initiates the posture change for them. However, careful calibration is critical – too fast or obvious a movement could alert or annoy the user. When done right, these adaptive workspace techniques demonstrate how XR systems (potentially coupled with IoT furniture) can implicitly guide healthier behavior, such as periodically switching between sitting and standing, maintaining ergonomic postures, or taking micro-breaks to move.
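To make the calibration point concrete, a tilt schedule can be sketched as a slow ramp with an onset delay and a safety cap. The parameters below are illustrative assumptions for the sketch, not TiltChair's actual values [6].

```python
def chair_tilt_angle(sitting_minutes, onset_min=45.0,
                     ramp_deg_per_min=0.1, max_deg=10.0):
    """Forward seat tilt (degrees) as a function of continuous sitting time.

    No tilt before onset_min; afterwards the seat inclines at a rate slow
    enough to stay under the user's perception, capped at max_deg for safety.
    All values are illustrative, not from the cited study.
    """
    if sitting_minutes <= onset_min:
        return 0.0
    return min(max_deg, (sitting_minutes - onset_min) * ramp_deg_per_min)
```

The key design lever is `ramp_deg_per_min`: too high and the motion is perceived (and resented) as an interruption; too low and no sit-to-stand transition ever occurs.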
Ambient Cues and Implicit Nudges
Beyond directly manipulating the work apparatus, XR environments can incorporate ambient cues – subtle signals in the virtual surrounding – to encourage movement. These cues can be visual, auditory, or haptic, and are designed to influence the user’s behavior in the background of their primary task. Unlike an explicit notification (e.g. “Time to stand up!”), ambient nudges might come in the form of a small environmental change that the user responds to organically. For example, a VR office environment could simulate the phone ringing on a virtual desk across the room, prompting the user to stand and “answer” it (thus getting a break from sitting). Or the virtual lighting could slowly dim when the user has been idle too long, encouraging them to move (perhaps to trigger the lights back on). Such cues leverage normal reactions – the user moves because something in the scene draws them, not because they were overtly told to move.
Empirical research on ambient XR cues for movement is still emerging, but there are related findings. In social VR settings, one could use virtual avatars or agents as cues. Consider a remote VR meeting: a virtual colleague’s avatar might periodically stand or stretch; seeing this, other users might subconsciously mirror the action. This is rooted in social contagion and mirroring effects. While no study to date has explicitly quantified “avatar-induced” movement in office VR, it is known from virtual avatar research that users often mimic an avatar’s posture or gestures. For instance, if a virtual agent shifts its gaze or points somewhere, the immersed user tends to turn their head or body to follow, without thinking about it. Such mechanisms could be harnessed to break up monotony – e.g. a proactive virtual assistant could wander around the user’s space, causing the user to swivel their chair or walk a few steps to keep the assistant in view. These are implicit non-verbal cues embedded in the XR content that spur user motion.
A recent study on mixed reality agents noted that an agent’s locomotion and posture can influence users’ social perceptions and attention [7], suggesting that carefully designed agent behaviors might also influence users’ physical engagement (though this specific application remains to be tested).
Audio cues are another modality: XR systems can utilize spatial audio to nudge movement. For example, a subtle sound of an object dropping behind the user’s avatar might lead them to turn around (a reflexive response to sound location). In doing so, the user gets a neck and torso movement that relieves static posture. If such sounds are integrated into the virtual office ambience (say, an occasional distant phone buzz or a knock that causes the user to briefly look up and pivot), they can provide micro-breaks for the body with minimal mental distraction. Similarly, haptic nudges (e.g. a gentle vibration in a VR controller after 30 minutes of inactivity) could prompt the user to shift position reflexively – though haptic cues in isolation might be noticed as a system notification, so they are often combined with a virtual context (for instance, a controller vibration coinciding with the above-mentioned phone ring scenario, making it diegetic to the experience).
It’s worth noting that immersive environments can inadvertently cause certain movements, which designers might turn into health benefits. A curious finding is that simply wearing a VR headset with an engaging visual scene can increase natural body sway during standing.
A 2020 study showed that young adults standing still on a balance platform swayed significantly more when wearing an HMD showing a virtual scene than when not wearing VR [8]. The visual flow and depth cues in VR made them unconsciously adjust their posture more – a phenomenon akin to the well-known “moving room” illusion in balance research. While increased sway is not the same as intentional exercise, it indicates that the body responds to virtual stimuli in subtle ways.
In fact, controlled exposure to certain VR scenarios (like simulated heights or moving visual surroundings) has been found to induce more postural adjustments as the user maintains balance [9]. Future interventions might leverage this by designing VR work environments that ever so slightly challenge balance or orientation (within safe limits), thus preventing the user from remaining too static. For example, a virtual office could have a gentle “floating” effect or very slow oscillation, imperceptible as motion but enough that the user’s balance system stays active, causing continuous micro-adjustments of core muscles. These ambient techniques remain speculative but are grounded in the sensorimotor effects observed in VR. The key is to integrate them subtly so that they are part of the immersive experience, keeping the user unconsciously physically engaged.
Subtle Gamification and Embedded Exercise Mechanics
A highly promising approach to increasing movement in XR is through gamification – turning tasks into games or using game elements that incidentally require physical activity. VR “exergames” (exercise games) are an obvious example, but the focus here is on making the physical effort feel so fun or integrated that the user doesn’t register it as exercise. In the context of office XR use, this often means providing short, enjoyable VR experiences during breaks or even alongside work tasks that get people moving in an implicit way. Research has shown that immersive games can transform the perception of effort: users may achieve moderate-to-vigorous physical activity while feeling like they are merely playing. For instance, a Penn State study found that college students pedaling a stationary bike perceived the exercise as easier when using a VR cycling game, compared to traditional cycling, even though their actual exertion was the same or higher [10].
VR’s engaging visuals and interactivity distracted them from fatigue – “the sense of effort was much less than the effort actually expended” according to the researchers [10]. This reduced perceived exertion is a key benefit of gamified physical activity in XR.
In workplace settings, several studies have explored deploying VR games or playful VR applications to counter sedentary time. One notable 8-week field study set up a “VR game studio” in an office and allowed employees to play commercial VR games (like VR sports, rhythm and action games) during their breaks [11]. The games chosen (for example, a VR boxing game, a dance game, etc.) required full-body movements such as dodging, squatting, or swinging arms. The participants, who all had desk-bound jobs, ended up engaging in substantial physical activity through these play sessions. Key findings were very positive: all participants achieved beneficial levels of activity and mood improvement, and over half of them met recommended weekly fitness guidelines only because of the VR sessions (they would not have met them otherwise) [11].
In other words, integrating short VR play breaks enabled normally sedentary workers to attain ~150 minutes of moderate exercise per week, an important health milestone. Users also reported enjoying the experience; the novelty and fun of VR kept them coming back. Prior work by Yoo et al. (2017–2020) similarly found that VR exercise games can provide higher exertion than people realize and sustain engagement over time [11]. Table 1 lists some VR exergame interventions that have been studied with office workers and students.
Crucially, these VR games embed movement into compelling content – slashing targets with lightsabers (e.g. Beat Saber) or defending against VR zombies can require continuous squatting and lunging, yet users perceive it as entertainment, not a workout obligation. This aligns with the idea of subtle game mechanics: the game’s rules naturally demand physical responses.
For example, a subtle mechanic might be a VR puzzle that can only be solved by reaching high up (to encourage stretching) or by physically walking to different spots in the room to find clues. The user’s focus is on solving the puzzle; any exercise is a byproduct.
One prototype called “VRun” had users run in place to move their virtual avatar – testers ended up running considerable distances in place while just trying to win the VR game [12]. Adaptive difficulty can also be used: if a user has been too sedentary, the VR game could slightly increase the physical challenge (spawn an enemy that the user must dodge, etc.) to prompt movement, without the user realizing this was a health-motivated nudge.
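The adaptive-difficulty idea can be sketched as a policy that maps accumulated sedentary time to an in-game event whose physical demand escalates. The event names and timings below are invented purely for illustration; no cited system uses these values.

```python
def movement_challenge(sedentary_s, base_s=300, escalate_every_s=120):
    """Return None while the player is moving enough; otherwise return an
    in-game event whose physical demand grows with time spent still.

    The nudge is framed as gameplay (an enemy to dodge, an object to reach),
    never as a health prompt. Event names and thresholds are illustrative.
    """
    if sedentary_s < base_s:
        return None
    level = min(3, 1 + int((sedentary_s - base_s) // escalate_every_s))
    return {
        1: "reach_target",        # stretch an arm upward
        2: "dodge_projectile",    # lean or squat
        3: "cross_room_objective" # take a few real steps
    }[level]
```

A real implementation would feed this from headset and controller motion data, so that any natural movement resets the sedentary timer.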
A few research projects have explored MR (mixed reality) for implicit exercise in work contexts – for instance, an AR system for virtual classes that displays periodic prompts requiring students to physically mimic shapes or letters, thereby sneaking in activity during class [13].
**Table 1.** Implicit movement interventions studied in XR (and related) office contexts.

| Study / Intervention (Year) | Technique Type (Implicit Strategy) | XR Platform / Technology | Targeted Movement Behavior | User Population | Key Findings / Effectiveness |
|---|---|---|---|---|---|
| Cuperus et al., VR Distance Manipulation (2018) [1] | Spatial distortion of environment (stretched vs. compressed virtual hallway distances) | VR treadmill setup with HMD (immersive VR) | Walking endurance (distance on treadmill) | Patients with intermittent claudication (peripheral artery disease) | Increased walking distance when virtual distance was subtly increased; participants did not notice any difference in the environments. Longer virtual paths led to significantly more physical walking vs normal or shortened paths (p < .01). Demonstrated subconscious boost to exercise via visual manipulation. |
| Shin et al., “Slow Robot” Posture Monitor (2019) [3] | Unobtrusive posture correction by slowly moving a physical monitor (below perception threshold) | Not XR – physical robotic monitor arm (basis for XR concept) | Upright sitting posture; reducing slouching | Office workers (lab setting, n=12) | 4-hour trials showed frequent spontaneous posture corrections when monitor moved imperceptibly. Slouch duration dropped significantly and users reported less fatigue. Most did not realize the monitor caused the adjustments. Validated that subtle physical movement of the workspace can induce healthier posture. |
| Shin et al., “Body Follows Eye” in VR (2020) [2] | Gradual repositioning of virtual screen content to guide posture | VR HMD (Oculus Rift, etc.) with virtual desktop environment | Head and neck posture (sitting alignment) | Knowledge workers (experiment) | Demonstrated an XR equivalent of Shin 2019: a planar virtual screen that slowly shifts position/orientation in VR. Users’ bodies unconsciously followed the content, promoting neutral posture. Reduced neck strain was inferred, and technique was unnoticed by users. |
| McGill et al., Implicit VR Screen Realignment (2020) [4] | Egocentric virtual screen that auto-adjusts based on head orientation (implicit alignment) | VR HMD (immersive virtual workspace) | Neck angle and viewing ergonomics | Office software users (lab study) | Found that subtle, continuous adjustment of the virtual screen relative to the user’s viewpoint minimized neck discomfort. Users could access a wider virtual workspace without consciously repositioning, as the system “met them halfway.” Supports that invisible interface tweaks can maintain comfort. |
| Fujita et al., Auto-Tilting Chair “TiltChair” (2021) [6] | Slow physical tilt of seat pan to prompt standing | Not XR by itself (smart office chair; can integrate with VR/MR) | Sit-to-stand transition (breaking prolonged sitting) | Office workers (simulation) | A slow (~hour-scale) forward tilt caused users to stand up naturally to avoid sliding off, achieving a sit–stand transition without a prompt. Performance on their work task was not affected. Suggests feasible integration with VR setups: e.g. chair tilts during a VR meeting to get the user on their feet. |
| Dang et al., VR Posture Feedback “Moving Ball” (2025) [5] | Ambient visual cue in VR: a floating ball that encourages user to straighten up to keep it in view | VR HMD (gaming context) | Head and back posture (preventing slouch) | VR gamers (young adults, user study) | This subtle in-game mechanic effectively reduced slouching time. Many participants preferred it over explicit UI alerts, saying it felt natural (“made me look up instinctively”). Combined with slight color changes or icons, it improved posture during VR sessions. Shows that game elements can double as posture nudges. |
| Yoo et al., Workplace VR Exergaming (2020) [11] | Gamified exercise breaks: VR games (e.g. sports, rhythm) installed in office, voluntarily played during breaks | Room-scale VR (HTC Vive) with various games requiring movement | Moderate–vigorous physical activity (aerobic exercise, general movement) | Sedentary office workers (8-week field study, n=11) | Participants took ~10-minute VR play sessions (avg 3x/day). All achieved meaningful physical activity; 6 of 11 only met weekly PA guidelines thanks to the VR sessions. Also reported higher energy and improved mood. Highlights that fun VR content can integrate exercise into workday effectively. |
| Stoltzfus et al., VR Cycling vs Traditional (2024) [10] | Immersive VR biking (virtual scenery & gameplay) vs. normal stationary bike exercise | VR HMD (Oculus) paired with stationary bike (VZFit system) | Perceived exertion during aerobic exercise | Sedentary college students (n=22) | VR condition significantly lowered perceived effort in the first critical minutes of exercise. Heart rate and exercise metrics were similar, but students felt it was easier and more enjoyable in VR. This suggests XR can reduce psychological barriers to exercise by engaging attention elsewhere. |
| Singh et al., Active Class Prototype (2022) [13] | Augmented reality prompts during online classes (e.g. AR overlays that students must stand or move to interact with) | AR on mobile or HoloLens (exploratory study) | Light physical activity during otherwise sedentary screen time | Remote students (pandemic context) | Although preliminary, the AR interventions led to increased movement (standing, stretching) during virtual classes. Students found it novel and fun, though alignment with class content was important. Points to MR as a tool for embedding movement into daily routines like meetings or learning, keeping users active without leaving their virtual context. |
Research Gaps and Future Directions
Despite the progress, this domain is still developing, and several gaps and challenges remain:
• Long-term adherence and behavior change: Most studies so far are short-term (a few weeks or controlled experiments). It’s unclear if the effects persist and if users continue to engage with these interventions over months or years. Long-term follow-up studies are needed to see if, for example, workers habituate to the cues and start ignoring them, or conversely, if new healthy habits form (like routinely standing and stretching without needing cues). As one review pointed out, these approaches seem promising but “long-term follow-up of workers is still needed to confirm changes in habitual behavior”. Future research should also examine if the benefits (posture improvement, reduced discomfort, etc.) lead to measurable health outcomes over time (fewer injuries, better metabolic markers).
• Balancing obtrusiveness and effectiveness: The interventions must remain subtle enough not to disrupt work or annoy users, yet potent enough to elicit meaningful movement. Finding this balance can be tricky. For instance, if a virtual nudging cue is too gentle, some users might not respond (e.g. they might tune out a barely audible sound). On the other hand, if it’s too bold (like a very noticeable visual shift), it risks breaking the user’s immersion or concentration. Studies like Fujita’s have begun to map these thresholds (adjusting angle/speed of chair tilt to avoid being intrusive), but more work is needed to establish design guidelines for “just noticeable” interventions. Personalization may be key – systems could learn an individual’s sensitivity and adjust the intensity of nudges accordingly. Adaptive algorithms that increase stimulus gradually until a response is detected (and then use that as a baseline) are one future direction.
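The adaptive-threshold idea sketched in this bullet resembles a staircase procedure from psychophysics: intensity rises while the user fails to respond, then eases back toward subtlety once a response is detected. A minimal sketch, with illustrative step sizes and bounds:

```python
def calibrate_nudge(intensity, responded,
                    step_up=1.1, step_down=0.8, lo=0.1, hi=1.0):
    """One staircase update for nudge intensity (e.g. cue brightness or speed).

    If the last nudge went unanswered, raise intensity; if the user responded,
    reduce it so the system converges near the individual's response
    threshold. All parameter values are illustrative assumptions.
    """
    new = intensity * (step_down if responded else step_up)
    return min(hi, max(lo, new))
```

Run over many nudge events, the intensity oscillates around the lowest level that still elicits a response, which is exactly the "just noticeable" operating point the design guidelines would need.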
• Integration with workflow and productivity tools: For widespread adoption, movement-promoting features might be built into standard XR workplace applications (virtual desktop software, VR meeting platforms, etc.). Currently, many interventions are standalone prototypes. A promising direction is to collaborate with platform developers (like those making VR co-working software) to embed nudges as an optional feature. For example, an enterprise VR meeting app could have a “wellness mode” that automatically adds ambient movement cues if a meeting runs over an hour. Research on workspace awareness in XR [14] and context-aware interfaces can tie in here – the system can detect what the user is doing (typing, intense focus vs. idle, etc.) and choose appropriate moments to trigger a nudge (as Lee et al. did with task-switch triggers [15]). The goal is a smooth integration where the interventions feel like a natural part of the workflow (or even the narrative of the VR experience) rather than an external interruption.
• MR and real-world blending: Mixed reality offers a relatively unexplored avenue: because MR allows awareness of the real world, it might mitigate some safety concerns of VR (e.g. walking blindly). MR nudges could encourage movement within the actual office space in a controlled way. One could imagine AR glasses that not only remind you to move but visually highlight a path to walk for a few minutes, or spawn a virtual avatar that says “let’s walk to the window” and leads you there – giving you a brief walk without losing connection to reality. Some early prototypes and conceptual papers touch on this, but real-world trials are scant. Researchers will need to tackle issues like tracking reliability in AR, user acceptance of AR overlays during work, and ensuring privacy (since AR may involve cameras in workplace). Yet, MR could be key to bringing subtle interventions out of purely virtual contexts and into everyday offices (including those not using VR for work tasks).
• Multi-user and social factors: So far, interventions have focused on individual behavior, but offices are social environments. Introducing movement nudges in multi-user XR (like VR meetings) raises questions: Will people feel self-conscious to stand up if their avatar is the only one doing so? Could social dynamics be leveraged, such as designating a “virtual coach” role to one meeting participant who initiates a group stretch? Early evidence suggests XR can support more natural social interaction than video calls [16] – this could extend to group wellness activities. Future research might explore collaborative fitness breaks in VR, or competitive gamification among colleagues (who takes the most VR break steps today?). Conversely, it must ensure that one user’s intervention (e.g. their chair moving in MR) doesn’t inadvertently disturb others. Considering social acceptability and perhaps synchronizing nudges (so an entire team’s headsets subtly cue a break at the same time) could be valuable.
• Quantifying cognitive impacts: While physical outcomes are measured, we need more data on cognitive and performance effects. Do these movement breaks and posture improvements tangibly improve concentration, creativity, or fatigue levels in the afternoon? There is some evidence of reduced “Zoom fatigue” when using VR meetings [16], and exercise is known to boost mood and alertness. Studying XR nudges in situ could reveal secondary benefits like improved productivity or job satisfaction. On the flip side, researchers should also monitor for any negative cognitive impacts (e.g. if an intervention triggers at a wrong time and breaks flow, does it reduce work quality?). So far, results are reassuring with no major downsides reported [17] [6], but more rigorous assessment in real work scenarios would build confidence in these tools for employers.

Conclusion

Subconscious movement interventions in XR represent a novel interdisciplinary frontier – blending human-computer interaction, ergonomics, and health psychology. The groundwork laid by these studies shows that such interventions are not only feasible but effective at nudging behavior in positive ways. As XR technology matures and permeates the workplace, there is a timely opportunity to embed “health by default” into our virtual and augmented environments. Future research, addressing the gaps above, will help ensure these solutions are robust, personalized, and seamlessly integrated. Ultimately, the vision is for XR systems that serve not just as productivity or communication tools, but also as invisible wellness coaches – continuously and subtly helping users maintain healthy levels of movement and posture throughout their workday, all without pulling them out of the immersive flow of their tasks. The evidence so far provides plenty of reason to be optimistic about this vision becoming reality.
**References**
[1] A. A. Cuperus, A. Keizer, A. W. M. Evers, M. M. L. van den Houten, J. A. W. Teijink, and I. J. M. van der Ham, “Manipulating spatial distance in virtual reality: Effects on treadmill walking performance in patients with intermittent claudication,” Comput. Hum. Behav., vol. 79, pp. 211–216, Feb. 2018, doi: 10.1016/j.chb.2017.10.037.
[2] “Body Follows Eye: Unobtrusive Posture Manipulation Through a Dynamic Content Position in Virtual Reality | Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems.” Accessed: May 13, 2025. [Online]. Available: https://dl.acm.org/doi/10.1145/3313831.3376794
[3] J.-G. Shin et al., “Slow Robots for Unobtrusive Posture Correction,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), New York, NY, USA: Association for Computing Machinery, May 2019, pp. 1–10, doi: 10.1145/3290605.3300843.
[4] M. McGill, A. Kehoe, E. Freeman, and S. Brewster, “Expanding the Bounds of Seated Virtual Workspaces,” ACM Trans. Comput.-Hum. Interact., vol. 27, no. 3, pp. 13:1–13:40, May 2020, doi: 10.1145/3380959.
[5] M. Dang, D. Luong, C. Napier, and L. Kim, “Co-Design & Evaluation of Visual Interventions for Head Posture Correction in Virtual Reality Games,” 2025, doi: 10.1145/3706598.3713177.
[6] “TiltChair: Manipulative Posture Guidance by Actively Inclining the Seat of an Office Chair | Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems.” Accessed: May 13, 2025. [Online]. Available: https://dl.acm.org/doi/10.1145/3411764.3445151
[7] Z. Chang, J. Cao, K. Gupta, H. Bai, and M. Billinghurst, “Exploring the Effects of Mixed Reality Agents’ Locomotion and Postures on Social Perception Through a Board Game,” Int. J. Human–Computer Interact., pp. 1–19, doi: 10.1080/10447318.2024.2435694.
[8] L. F. I. Imaizumi et al., “Virtual reality head-mounted goggles increase the body sway of young adults during standing posture,” Neurosci. Lett., vol. 737, p. 135333, Oct. 2020, doi: 10.1016/j.neulet.2020.135333.
[9] B.-R. Jian, Y.-H. Hwang, and H.-W. Liang, “Influence of virtual heights and a cognitive task on standing postural steadiness,” Int. J. Ind. Ergon., vol. 100, p. 103553, Mar. 2024, doi: 10.1016/j.ergon.2024.103553.
[10] “Virtual reality may help reduce perceived efforts while exercising | Penn State University.” Accessed: May 13, 2025. [Online]. Available: https://www.psu.edu/news/berks/story/virtual-reality-may-help-reduce-perceived-efforts-while-exercising
[11] S. Yoo, P. Gough, and J. Kay, “Embedding a VR Game Studio in a Sedentary Workplace: Use, Experience and Exercise Benefits,” 2020, doi: 10.1145/3313831.3376371.
[12] S. Yoo and J. Kay, “VRun: running-in-place virtual reality exergame,” in Proceedings of the 28th Australian Conference on Computer-Human Interaction (OzCHI ’16), New York, NY, USA: Association for Computing Machinery, Nov. 2016, pp. 562–566, doi: 10.1145/3010915.3010987.
[13] M. Singh and R. Peiris, “Designing Augmented Reality Based Interventions to Encourage Physical Activity During Virtual Classes,” 2022, doi: 10.1145/3491101.3519749.
[14] R. Assaf, D. Mendes, and R. Rodrigues, “Cues to fast-forward collaboration: A Survey of Workspace Awareness and Visual Cues in XR Collaborative Systems,” Comput. Graph. Forum, vol. 43, no. 2, p. e15066, 2024, doi: 10.1111/cgf.15066.
[15] B. Lee, S. Wu, M. J. Reyes, and D. Saakes, “The Effects of Interruption Timings on Autonomous Height-Adjustable Desks that Respond to Task Changes,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), New York, NY, USA: Association for Computing Machinery, May 2019, pp. 1–10, doi: 10.1145/3290605.3300558.
[16] N. Held, M. Soeter, S. van Gent, N. Wiezer, G. Loots, and O. Niamut, “Immersive gathering: insights into virtual workplace meetings,” Front. Virtual Real., vol. 5, Sep. 2024, doi: 10.3389/frvir.2024.1391662.
[17] C. Breazeal, A. Wang, and R. Picard, “Experiments with a robotic computer: body, affect and cognition interactions,” in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI ’07), New York, NY, USA: Association for Computing Machinery, Mar. 2007, pp. 153–160, doi: 10.1145/1228716.1228737.