Unlocking the Full Potential of Human Senses in Flight
Research at UMD’s Extended Reality Lab focuses on integrating visual feedback with other sensory channels, such as haptics, active vestibular stimulation, and spatialized (3D) audio, to improve training effectiveness. The team uses wearable physiological sensors to measure pilot workload during simulations and to actively manage and optimize the training experience.
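As a rough illustration of how wearable data might feed into workload measurement (the article does not describe the lab’s actual methods), the sketch below computes RMSSD, a standard heart-rate-variability statistic often used as a workload proxy. The sample window and the threshold are invented for the example.

```python
# Hypothetical sketch of a wearable-based workload proxy: RMSSD, the root
# mean square of successive differences between heartbeats, is a standard
# heart-rate-variability measure that tends to drop as workload rises.
# The sample window and the 20 ms threshold are illustrative assumptions,
# not values from the UMD lab.
import math

def rmssd(rr_intervals_ms: list[float]) -> float:
    """Root mean square of successive differences of R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Example: one short window of inter-beat intervals from a chest strap.
window_ms = [812.0, 795.0, 830.0, 788.0, 805.0, 776.0]
score = rmssd(window_ms)
flag = " (possible elevated workload)" if score < 20 else ""
print(f"RMSSD: {score:.1f} ms{flag}")
```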
Pilots rely primarily on vision and their sense of equilibrium when flying, so other senses, such as touch and hearing, are generally underutilized in aircraft control. “We’re investigating ways to effectively use these additional sensory inputs to enhance pilot performance and improve aviation safety, especially in off-nominal situations,” Saetti says.
Most of the lab’s projects involve evaluating how well pilots control the aircraft in a VR setting and testing haptic cueing algorithms that provide physical feedback during flight. For example, in collaboration with Lockheed Martin, the researchers have investigated how tactile feedback (the sense of touch) can augment pilot perception and improve performance in visually degraded scenarios. They’ve developed algorithms and methods that assist pilots flying in darkness, through clouds, or under other challenging conditions, ultimately increasing flight safety.
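To make the idea of a haptic cueing algorithm concrete, here is a minimal, hypothetical sketch in the same spirit: it maps deviation from a commanded flight path to the vibration intensity of left and right tactors. The gains, deadband, and cueing convention are assumptions for illustration only, not the algorithms developed by the lab or Lockheed Martin.

```python
# Minimal, hypothetical proportional haptic cueing law. A deadband keeps
# small errors from producing constant buzzing; intensity then grows
# linearly with error and saturates at 1.0. The "status" convention used
# here (cue on the side the aircraft has drifted toward) is one common
# choice; a "command" display would cue the corrective direction instead.

def haptic_cue(lateral_error_m: float, gain: float = 0.2,
               deadband_m: float = 0.5) -> tuple[float, float]:
    """Map lateral path error in meters to (left, right) tactor
    intensities in [0, 1]. Positive error = drifted right."""
    magnitude = abs(lateral_error_m) - deadband_m
    if magnitude <= 0.0:
        return 0.0, 0.0
    intensity = min(1.0, gain * magnitude)
    return (0.0, intensity) if lateral_error_m > 0 else (intensity, 0.0)

# Example: the aircraft has drifted 3 m right of the commanded path,
# so the right-side tactor fires at moderate intensity.
left, right = haptic_cue(3.0)
print(f"left: {left:.2f}, right: {right:.2f}")  # left: 0.00, right: 0.50
```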
More recently, in a project funded by the National Science Foundation, the team is examining whether haptic feedback can shorten pilot training time. Early results are promising, indicating that tactile cues can accelerate training for both piloted rotorcraft and remotely operated vehicles.