Accessibility and Distance Perception in Virtual Reality Head-Mounted Displays for People with Dyspraxia

Preprint 2021

Virtual reality (VR) has great potential as a technological intervention for disabled people. However, most human factors research into VR does not consider people with motor learning disabilities. Here, we consider the accessibility challenges faced by people with dyspraxia when using the current generation of VR head-mounted displays. Our work consists of two studies. The first is an exploratory workshop in which people both with and without dyspraxia tried two commercial VR applications. This study showed that participants with dyspraxia had greater difficulty perceiving distance, which led to a higher level of mental workload. To investigate ways to mitigate this difficulty, we conducted a second, lab-based experiment exploring how readily adjustable physical features of current VR systems (interpupillary distance and the distance between the eyes and the lenses in the headset) could be used to lower mental demands and improve accessibility. The outcome is used to propose design guidelines for future VR systems.
This work builds on our previous projects [1-7].

References

[1] Cho, Y., Julier, S. J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR Mental Health, 6(4), e10140.

[2] Cho, Y., 2021. Rethinking Eye-blink: Assessing Task Difficulty through Physiological Representation of Spontaneous Blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1-12.

[3] Cho, Y., et al., 2019. Nose heat: Exploring stress-induced nasal thermal variability through mobile thermal imaging. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 566-572.

[4] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and affective computing through thermal imaging: A survey. arXiv preprint arXiv:1908.10307.

[5] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 456-463.

[6] Cho, Y., Kim, S. and Joung, M., 2017. Proximity sensor and control method thereof. U.S. Patent 9,703,368.

[7] Cho, Y., Joung, M. and Kim, S., 2018. Vehicle display apparatus including capacitive and light-based input sensors. U.S. Patent 9,891,756.