Long Cheng1,3, Valentin Holzwarth2, and Andreas Kunz3
Video see-through (VST) cameras, integrated into virtual reality (VR) headsets, offer a convenient means of bridging the real world and virtual objects. However, performing tasks within a VST system can deviate from real-world experiences. In this study, we assess the technical specifications of a commercially available consumer-grade VR headset and investigate user performance across various tasks in a VST environment, using a pilot user study and the prism adaptation method. The tasks include object relocation, drinking, screwing, and typing on a physical tablet. Our findings reveal a decline in performance for tasks requiring close-range interaction and screen-based operations, accompanied by user adaptation to the studied tasks. We also note mild motion sickness symptoms but find no discernible aftereffects associated with the tasks examined.
Keywords: Mixed reality, Human factor study, Human adaptation, Virtual reality
1RhySearch, Buchs, 9470, Switzerland
2Atlas VR, Schlieren, 8952, Switzerland
3ETH Zurich, Zurich, 8092, Switzerland