Preprint 2022
Yu-Wei Yang and Youngjun Cho
Department of Computer Science, University College London, London, UK
EEG neurofeedback has shown positive effects on motor rehabilitation in daily settings. However, existing works are highly heterogeneous, lacking systematic validation of the various intervention training strategies and terminologies adopted for different users, EEG devices and feedback methods. Here, we contribute a systematic review and meta-analysis that provides a constructive overview of EEG neurofeedback interfaces and assesses the effectiveness of intervention strategies. From 5307 initially identified articles, we select 62 key articles for systematic review and 35 eligible studies for a meta-analysis of motor-related outcomes, from which we report significant improvements in motor performance. Building on our findings on the effectiveness of individual factors, we develop a taxonomy to inform the future research agenda for cutting-edge EEG neurofeedback systems and intervention strategies. We also provide a guideline for practitioners and end-users on choosing optimal intervention approaches, along with insights into challenges and opportunities for future interface design (see our previous related work for further insights [1-4]).
References
[1] Moge, C., Wang, K. and Cho, Y., 2022. Shared user interfaces of physiological data: Systematic review of social biofeedback systems and contexts in HCI. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (pp. 1-16).
[2] Wang, K., Julier, S. and Cho, Y., 2022. Attention-Based Applications in Extended Reality to Support Autism: A Systematic Review. IEEE Access.
[3] Cho, Y., 2021. Rethinking eye-blink: Assessing task difficulty through physiological representation of spontaneous blinking. In Proceedings of the 2021 CHI conference on human factors in computing systems (pp. 1-12).
[4] Cho, Y., Kim, S. and Joung, M., 2017. Proximity sensor and control method thereof. U.S. Patent 9,703,368.
Preprint 2022
Selina He and Youngjun Cho
Department of Computer Science, University College London, London, UK
Public speaking is an essential soft skill for professional development. However, it is also a common stressor for university students. Here, we are interested in biofeedback, an intervention technique that can potentially mitigate negative emotional responses triggered by stress. In particular, we investigate to what extent the shared use of biofeedback across users (social biofeedback) can improve student presenters’ experiences during academic oral presentations. In a mixed-methods study with 15 participants, we demonstrate a significant improvement in presenters’ affective states (toward higher arousal and higher valence) with social biofeedback in comparison with typical biofeedback and a control condition. Our thematic analysis also suggests positive influences on presenters’ self-oriented, others-oriented, and task-oriented regulations. We conclude by highlighting social biofeedback’s potential benefits in enhancing engagement and social connectedness in remote learning and teaching environments.
*This work builds on our previous projects (see details in [1-8]).
References
[1] Moge, C., Wang, K. and Cho, Y., 2022. Shared user interfaces of physiological data: Systematic review of social biofeedback systems and contexts in HCI. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (pp. 1-16).
[2] Cho, Y., 2021. Rethinking eye-blink: Assessing task difficulty through physiological representation of spontaneous blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-12).
[3] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 456-463.
[4] Cho, Y., Julier, S. J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), e10140.
[5] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and affective computing through thermal imaging: A survey. arXiv preprint arXiv:1908.10307.
[6] Cho, Y. et al., 2019. Nose heat: Exploring stress-induced nasal thermal variability through mobile thermal imaging. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 566-572.
[7] Cho, Y., Kim, S. and Joung, M., 2017. Proximity sensor and control method thereof. U.S. Patent 9,703,368.
[8] Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503.
Preprint 2022
Pauline Hohl and Youngjun Cho
Department of Computer Science, University College London, London, UK
Eye blinking is a subconscious behaviour that is associated with our affective states. While spontaneous eye blinking and its relationship with anxiety and stress have been actively explored in the literature, it has not yet been considered within a biofeedback loop. Here, we explore biofeedback meditation mechanisms based on eye blinks. In a within-participant experiment (n=19), we assess the impact of eye-blink biofeedback on self-perceived anxiety and stress levels and on spontaneous eye blink rate (SEBR), exposing participants to two conditions (with visual biofeedback exercises, and without them) during an anxiety- and stress-inducing mock job interview. Our results show no significant effects of the independent variable on self-rated anxiety/stress scores or SEBR; however, we identified that eye blink-triggered visual biofeedback can be distracting, as it can lead to high perceptual load during the mock interview. Building upon participants’ feedback, we establish design implications and suggest further research directions, which could involve other modalities such as tactile or olfactory feedback for eye-blink biofeedback. Note that this work builds on our previous projects (see details in [1-9]).
References
[1] Moge, C., Wang, K. and Cho, Y., 2022. Shared user interfaces of physiological data: Systematic review of social biofeedback systems and contexts in HCI. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (pp. 1-16).
[2] Cho, Y., 2021. Rethinking eye-blink: Assessing task difficulty through physiological representation of spontaneous blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-12).
[3] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 456-463.
[4] Cho, Y., Julier, S. J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), e10140.
[5] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and affective computing through thermal imaging: A survey. arXiv preprint arXiv:1908.10307.
[6] Cho, Y. et al., 2019. Nose heat: Exploring stress-induced nasal thermal variability through mobile thermal imaging. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 566-572.
[7] Cho, Y. and Joung, M., 2017. Display apparatus and method for operating the same. U.S. Patent 9,733,765.
[8] Cho, Y., Youn, J., Joung, M., and Kim, S., 2020. Vehicle display device and vehicle. U.S. Patent 10,534,440.
[9] Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503.
Preprint 2022
Jiani Wang and Youngjun Cho
Department of Computer Science, University College London, London, UK
We introduce Silent Rider, a visual-tactile messaging interface that aims to facilitate a hearing-impaired deliverer’s mobile phone communication with their customers. Silent Rider augments a typical smartwatch interface with a virtual module that interchanges users’ different input modalities. A customer’s speech is converted into tactons for a user with profound hearing loss to understand, while that user can produce voice output through customized messaging functions. To implement this interface, we first identify design requirements and then present the system workflow and designed functionality. Lastly, we discuss future design opportunities to help hearing-impaired people communicate in the workplace.
*This work builds on our previous projects (see details in [1-5]).
References
[1] Cho, Y., Joung, M. and Kim, S., 2015. Device and method for generating vibrations. U.S. Patent Application 14/758,397.
[2] Schmitz, A., Holloway, C. and Cho, Y., 2020. Hearing through vibrations: Perception of musical emotions by profoundly deaf people. arXiv preprint arXiv:2012.13265.
[3] Cho, Y., 2018. Sensorless Resonance Tracking of Resonant Electromagnetic Actuator through Back-EMF Estimation for Mobile Devices. arXiv preprint arXiv:1803.07065.
[4] Cho, Y. and Joung, M., 2017. Display apparatus and method for operating the same. U.S. Patent 9,733,765.
[5] Cho, Y., Youn, J., Joung, M., and Kim, S., 2020. Vehicle display device and vehicle. U.S. Patent 10,534,440.
Preprint 2021
Alok C. Suresh and Youngjun Cho
The ability to accurately detect and model emotion is of great importance in advancing HCI applications and interfaces. Accordingly, research into automatic emotion recognition has steadily increased in recent years. This work addresses that task, experimenting with attention-based CNNs on spectrograms of physiological signals. We used spectrograms of respiration and blood volume pulse signals to condense physiological information into a two-dimensional image format. These were then used to train standard CNN models as well as attention-based variants to discriminate between emotional states across various levels of valence and arousal. We also investigated fusion methods for the spectrograms from the two physiological sensing channels, as well as annotation strategies for classifying emotions in supervised learning. All models explored in this work were evaluated on the DEAP dataset. This work builds on our previous projects [1-7], confirming the robustness of time-frequency representations of physiological signals in automatic emotion recognition.
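For context, turning a 1-D physiological signal into a 2-D spectrogram image for CNN input can be sketched as below. This is a minimal illustration, not the paper's pipeline: the sampling rate, window parameters and log scaling are assumptions, and a synthetic respiration-like trace stands in for a real DEAP channel.

```python
import numpy as np
from scipy import signal

fs = 128  # assumed sampling rate (DEAP signals are commonly downsampled to 128 Hz)
t = np.arange(0, 60, 1 / fs)  # a 60-second analysis window

# Synthetic stand-in for a respiration trace (~0.25 Hz) with noise;
# in the study this would be a real respiration or blood volume pulse channel.
resp = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)

# Short-time Fourier analysis converts the 1-D signal into a 2-D
# time-frequency image that can be fed to a CNN like any other image.
freqs, times, Sxx = signal.spectrogram(resp, fs=fs, nperseg=256, noverlap=128)
log_spec = np.log1p(Sxx)  # log scaling compresses the dynamic range

print(log_spec.shape)  # (frequency bins, time frames)
```

Spectrograms from the two sensing channels could then be fused, e.g. stacked as separate input channels of one CNN or combined at the feature level, which is the kind of design choice the fusion experiments compare.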
References
[1] Cho, Y., 2021. Rethinking eye-blink: Assessing task difficulty through physiological representation of spontaneous blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1-12.
[2] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 456-463.
[3] Cho, Y., Julier, S. J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), e10140.
[4] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and affective computing through thermal imaging: A survey. arXiv preprint arXiv:1908.10307.
[5] Cho, Y., Bianchi-Berthouze, N., Marquardt, N. and Julier, S.J., 2018. Deep thermal imaging: Proximate material type recognition in the wild through deep learning of spatial surface temperature patterns. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-13.
[6] Cho, Y. et al., 2019. Nose heat: Exploring stress-induced nasal thermal variability through mobile thermal imaging. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 566-572.
[7] Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503.
Preprint 2021
Bei Xia and Youngjun Cho
Background: It is often challenging for visually impaired people to navigate independently. To address this, an increasing number of studies have explored navigation systems for visually impaired users. In particular, the ways in which users communicate with such systems through touch have been actively studied. In this paper, we aim: (1) to systematically synthesise the development of haptic feedback interfaces in navigation systems for the visually impaired community; and (2) to review the evaluation approaches that have been used in this field.
Method: A systematic search was conducted in the following databases: PubMed, IEEE Xplore, ACM and ScienceDirect. Through an initial screening of titles and abstracts, a total of 94 articles were selected from the databases. Our subsequent full-text review resulted in 32 articles for in-depth analysis.
Results: Three key themes emerged in haptic feedback-enabled navigation systems: i) obstacle avoidance, ii) direction instruction and iii) cognitive mapping. Vibro-tactile feedback is the most frequently used means of conveying navigation information, communicated through specific body areas such as the hands, abdomen, arms, feet and back. The feedback type and design were predominantly influenced by the body part chosen for interacting with a system. Completion time was the most widely adopted dependent variable in evaluation, while accuracy-based metrics were often overlooked. Numerical metrics were often complemented by post-study user feedback surveys and interviews.
Conclusions: This review summarises the scientific literature on haptic feedback for navigation, which helps develop our understanding of design trends in the field and contributes future directions. We believe that attending more closely to target user groups from the visually impaired community will be significantly beneficial, surfacing hidden insights and in turn driving innovation.
This work builds on our previous projects [1-4].
References
[1] Cho, Y., Bianchi, A., Marquardt, N. and Bianchi-Berthouze, N., 2016. RealPen: Providing realism in handwriting tasks on touch surfaces using auditory-tactile feedback. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, pp. 195-205.
[2] Cho, Y., Bianchi-Berthouze, N., Marquardt, N. and Julier, S.J., 2018. Deep thermal imaging: Proximate material type recognition in the wild through deep learning of spatial surface temperature patterns. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-13.
[3] Cho, Y. and Joung, M., 2017. Display apparatus and method for operating the same. U.S. Patent 9,733,765.
[4] Cho, Y., Joung, M. and Kim, S., 2015. Device and method for generating vibrations. U.S. Patent Application 14/758,397.
Preprint 2021
Yuxuan Liu and Youngjun Cho
While alpha binaural beats and classical music treatment have been actively explored for alleviating state anxiety, no previous research has looked into the possibility of blending the two subtle interventions to amplify their mental wellbeing benefits in daily settings. We hypothesize that classical music overlaid with binaural beats is more effective than classical music alone in reducing state anxiety. Twenty-four participants were randomly assigned to one of three intervention groups: binaural beats combined with classical music, classical music alone, and no intervention. We investigated this with the POMS tension-anxiety questionnaire, facial features and an attentional control game to assess anxiety levels in our experiment. Our findings show that classical music with binaural beats most significantly reduces state anxiety. Furthermore, we evaluate the effectiveness of the physiological and behavioural anxiety measures and offer recommendations for future sound-based anxiety intervention research.
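For readers unfamiliar with the stimulus, a binaural beat is produced by playing a slightly different pure tone to each ear; the brain perceives their frequency difference as a slow "beat". The NumPy sketch below illustrates synthesizing an alpha-band (8-12 Hz) beat; the carrier and beat frequencies are hypothetical choices, not the stimuli used in the study.

```python
import numpy as np

fs = 44100              # standard audio sample rate (Hz)
duration = 5.0          # seconds of audio
t = np.arange(int(fs * duration)) / fs

carrier = 220.0         # base tone in Hz (hypothetical)
beat = 10.0             # desired beat frequency, within the 8-12 Hz alpha band

# Each ear receives a pure tone; the 10 Hz difference between the two
# tones is perceived as the binaural beat.
left = np.sin(2 * np.pi * carrier * t)
right = np.sin(2 * np.pi * (carrier + beat) * t)
stereo = np.stack([left, right], axis=1)  # shape: (samples, 2) for L/R channels

# In the combined condition, a track like this would be mixed quietly
# underneath the classical music recording.
```

Writing `stereo` to a WAV file with any audio library and listening over headphones (the effect requires separate channels per ear) reproduces the basic stimulus.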
This work builds on our previous projects [1-5].
References
[1] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 456-463.
[2] Cho, Y., Julier, S. J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), e10140.
[3] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and affective computing through thermal imaging: A survey. arXiv preprint arXiv:1908.10307.
[4] Cho, Y., 2021. Rethinking Eye-blink: Assessing Task Difficulty through Physiological Representation of Spontaneous Blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1-12.
[5] Cho, Y. et al., 2019. Nose heat: Exploring stress-induced nasal thermal variability through mobile thermal imaging. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 566-572.
Preprint 2021
Virtual reality (VR) has great potential as a technological intervention for disabled people. However, most human factors research into VR does not consider people with motor learning disabilities. Here, we consider the accessibility challenges faced by people with dyspraxia when using the current generation of VR head-mounted displays. Our work consists of two studies. The first is an exploratory workshop in which people both with and without dyspraxia tried two commercial VR applications. This study showed that people with dyspraxia had greater difficulty in perceiving distance, leading to a higher level of mental workload. To investigate ways to mitigate this difficulty, we conducted a second lab-based experiment, exploring how readily adjustable physical features of current VR systems (interpupillary distance and the distance between the eyes and the lenses in the headset) could be used to lower mental demands and improve accessibility. The outcome is used to propose design guidelines for future VR systems.
This work builds on our previous projects [1-7].
References
[1] Cho, Y., Julier, S. J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), e10140.
[2] Cho, Y., 2021. Rethinking Eye-blink: Assessing Task Difficulty through Physiological Representation of Spontaneous Blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1-12.
[3] Cho, Y. et al., 2019. Nose heat: Exploring stress-induced nasal thermal variability through mobile thermal imaging. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 566-572.
[4] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and affective computing through thermal imaging: A survey. arXiv preprint arXiv:1908.10307.
[5] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 456-463.
[6] Cho, Y., Kim, S. and Joung, M., 2017. Proximity sensor and control method thereof. U.S. Patent 9,703,368.
[7] Cho, Y., Joung, M. and Kim, S., 2018. Vehicle display apparatus including capacitive and light-based input sensors. U.S. Patent 9,891,756.
Preprint 2021
S Oh, C Holloway and Y Cho
Department of Computer Science, University College London, London, UK
Fully autonomous vehicles are emerging, potentially benefitting the visually impaired community. However, people with visual impairment have been largely excluded from the design of infotainment systems and interactive features in traditional vehicles. This raises the question of whether interfaces designed for future vehicles will be more accessible and user-friendly for visually impaired drivers. It has remained unclear how inclusive current design strategies for autonomous vehicles are in meeting these users’ needs. Here, we examine current trends and challenges through a mixed-methods study. We analyzed patents on accessible user interfaces and self-driving vehicles to understand current trends. We then interviewed industry experts to obtain in-depth insight into these trends and the challenges the industry faces. The results highlight a growing interest in inclusion and the need to bring universal design practices in assistive technology to the field.
This work builds on our previous projects [1-7].
References
[1] Cho, Y., Youn, J., Joung, M., and Kim, S., 2020. Vehicle display device and vehicle. U.S. Patent 10,534,440.
[2] Cho, Y., Bianchi-Berthouze, N., Marquardt, N. and Julier, S.J., 2018. Deep thermal imaging: Proximate material type recognition in the wild through deep learning of spatial surface temperature patterns. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-13.
[3] Cho, Y., Joung, M. and Kim, S., 2018. Display apparatus for a vehicle. U.S. Patent 9,864,469.
[4] Cho, Y., Joung, M. and Kim, S., 2015. Device and method for generating vibrations. U.S. Patent Application 14/758,397.
[5] Cho, Y., Youn, J., Joung, M., and Kim, S., 2020. Vehicle display device and vehicle. U.S. Patent 10,534,440.
[6] Cho, Y., Kim, S. and Joung, M., 2017. Proximity sensor and control method thereof. U.S. Patent 9,703,368.
[7] Cho, Y., Joung, M. and Kim, S., 2018. Vehicle display apparatus including capacitive and light-based input sensors. U.S. Patent 9,891,756.
Preprint 2021
Yuliang Chen and Youngjun Cho
Department of Computer Science, University College London, London, UK
Fitness trackers have gained much attention as a daily physiological computing intervention to promote self-monitoring and self-regulation. This paper investigates the usage patterns and perceived health benefits of commercial fitness trackers before and during the COVID-19 pandemic. From our online survey and follow-up interview studies, we report positive relationships between usage frequency and perceived health benefits in relation to the lockdown state. Furthermore, the results show that users tended to be more motivated by wearing trackers to engage in physical activities and increase exercise intensity during the lockdown than in pre-lockdown periods. We conclude by discussing trackers’ broader potential benefits for both mental and physical wellbeing.
This work builds on our previous projects [1-5].
References
[1] Cho, Y., 2021. Rethinking Eye-blink: Assessing Task Difficulty through Physiological Representation of Spontaneous Blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1-12.
[2] Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503.
[3] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 456-463).
[4] Cho, Y., Julier, S.J. and Bianchi-Berthouze, N., 2019. Instant stress: detection of perceived mental stress through smartphone photoplethysmography and thermal imaging. JMIR mental health, 6(4), p.e10140.
[5] Cho, Y., Kim, S. and Joung, M., 2017. Proximity sensor and control method thereof. U.S. Patent 9,703,368.