The role of physiological responses in a VR-based sound localization task
Moraes, Adrielle N.
Virtual reality (VR) has recently emerged as a platform for e-health applications. Although most VR applications focus on visual stimuli as the main content, audio also plays a very important role. When auditory processing is impaired, the comprehension of auditory information is compromised. This condition reflects negatively on quality of life, given the impact of audio perception on one's ability to communicate effectively or to differentiate and ignore "noise". This work aims to design a VR application that can be used to: (a) optimise multimodal VR experiences based on user trials; (b) extract user data continuously throughout the experiment; and (c) evaluate a user's auditory processing ability. To accomplish this goal, participants are required to localise a sound source in space under multiple listening conditions with simple and complex sound-stimulus configurations. The data collected from users consist of physiological and objective metrics. The results of this study highlight the relationship between user behaviour (head movement, fixation points) and performance in the sound localisation task. This information can be used to design future applications for training auditory localisation ability. In addition, the evaluation compared the impact of two interaction methods for performing this task: using a pointer or eye gaze to indicate the location of the target source. The findings show statistically significant differences in physiological response when subjects are exposed to different interaction methods, with greater immersion and performance for the pointer group.