Abstract
This colloquium will consist of two half-hour parts.
Part I. Characterization of natural eye and head movements driving retinal flow
In the absence of moving objects, retinal flow is determined by eye velocity relative to the environment as well as by the structure of the environment. Eye velocity in space is the sum of head-in-space and eye-in-head velocity. To gain a better understanding of the head and eye movements driving retinal flow, we developed an ideal observer model of this process based on the assumption that observers tend to fixate features of the stationary environment. The model predicts that retinal flow is driven most strongly by 1) linear head velocity, 2) fixation direction and distance, and 3) the structure of the environment. We also developed a system to measure both head and eye velocity during everyday behaviors outside the lab. The system consists of a Pupil Labs eye tracker with an Intel RealSense T265 tracking camera rigidly attached to the world camera. The tracking camera reconstructs head velocity using a computer vision algorithm known as simultaneous localization and mapping (SLAM). Head and eye movements were recorded for participants walking around campus. We present preliminary data collected using this device. Specifically, we present statistics of linear head velocity and head orientation relative to gravity, and we discuss the implications for the perception of heading and orientation, as well as for the statistics of retinal flow.
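The velocity decomposition stated above can be written explicitly (the notation here is ours, introduced for illustration):

```latex
% Eye velocity relative to the world is the sum of
% head-in-world velocity and eye-in-head velocity:
\mathbf{v}_{\mathrm{eye/world}} = \mathbf{v}_{\mathrm{head/world}} + \mathbf{v}_{\mathrm{eye/head}}
```

In the measurement system described, the tracking camera supplies the head-in-world term and the eye tracker supplies the eye-in-head term.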
Part II. Underwater virtual reality system for neutral buoyancy training: development and evaluation
During terrestrial activities, the sensation of pressure on the skin and of tension in muscles and joints provides information about how the body is oriented relative to gravity and how it is moving relative to the surrounding environment. In contrast, in aquatic environments, when the body is suspended in a state of neutral buoyancy, the weight of the body and limbs is offloaded, rendering these cues uninformative. It is not yet known how this altered sensory environment impacts virtual reality experiences. To investigate this question, we converted a full-face SCUBA mask into an underwater head-mounted display and developed software to simulate jetpack locomotion outside the International Space Station. Our goal was to emulate conditions experienced by astronauts during training at NASA's Neutral Buoyancy Lab. A user study was conducted to evaluate both sickness and presence when using virtual reality in this altered sensory environment. We observed an increase in nausea-related symptoms underwater, but we cannot conclude that this was due to VR use. Other measures of sickness and presence underwater were comparable to measures taken above water. We conclude with suggestions, based on our experience, for improved underwater VR systems and improved methods for evaluating these systems.