
Engineering the Future of Flight: University of Maryland’s Flight Simulation and Control Lab Drives Breakthrough VR/XR Research
At UMD’s Extended Reality Flight Simulation and Control Lab, tomorrow’s flight training breakthroughs are being developed today. The lab is building next-generation technologies aimed at improving aviation safety, pilot performance, and training efficiency.
Advancing aerospace research and innovation
The Department of Aerospace Engineering at the University of Maryland (UMD) is pioneering the future of aviation through breakthrough research. At the heart of these innovations is the Extended Reality Flight Simulation and Control Lab, directed by Assistant Professor Umberto Saetti.
The lab specializes in advanced virtual, augmented, and extended reality (VR/XR) simulations for flight training and evaluation. It currently supports around 15 personnel, ranging from Master’s and PhD students to postdoctoral researchers.
Their research covers a broad spectrum of critical aerospace areas, including flight dynamics, vertical lift technologies, and human-machine interaction. At its core, their work explores new methods to enhance aviation safety, pilot performance, and training efficiency, focusing on approaches that have never been attempted before.
Since its establishment approximately two and a half years ago, the lab has secured roughly $3 million in research funding from prestigious organizations such as the U.S. Army, U.S. Navy, National Science Foundation, Lockheed Martin, and NASA.

The new frontier for university labs
Immersive headset-based simulations have emerged as powerful tools for conducting effective evaluation and research. For aerospace studies, these technologies enable the replacement of large dome-based physical cockpits with virtual environments that are more flexible, cost-effective, and accessible.
Benefits of VR/XR research simulations:
• Cost efficiency: Significantly reduces simulator costs, making advanced simulation technology financially accessible for universities.
• Reduced physical footprint: Eliminates the need for large screens and bulky structures, drastically decreasing the mass, inertia, and size of the simulators.
• Flexibility and versatility: Enables rapid and easy switching between different configurations without requiring separate physical setups.
• Multi-user capabilities: Affordable enough to acquire multiple units that can be linked together, enabling, for example, simultaneous multi-pilot and multi-aircraft simulations.
• Enhanced immersion and realism: Headsets can be mounted on various motion platforms and enable the simulation of maneuvers that were previously impossible.
• Improved visibility: Provides superior 360° look-down and peripheral vision capabilities, surpassing the limitations of physical screens.
Unlocking the full potential of human senses in flight
Research at UMD’s Extended Reality Lab focuses specifically on integrating visual feedback with diverse sensory inputs, such as haptics, active vestibular stimulation, and spatialized (3D) audio, to improve training effectiveness. The team uses a variety of wearable physiological sensing tools to accurately measure pilot workload during simulations and to actively manage and optimize the experience.
“My research focuses on developing a ‘super pilot’ by leveraging all available sensory cues,” explains Assistant Professor Umberto Saetti. Because pilots rely primarily on vision and their sense of equilibrium when flying, other senses, such as touch and hearing, are generally underutilized in aircraft control. “We’re investigating ways to effectively use these additional sensory inputs to enhance pilot performance and improve aviation safety, especially in off-nominal situations,” Saetti says.
Most of the lab’s projects involve evaluating how well pilots control the aircraft in a VR setting and testing haptic cueing algorithms that provide physical feedback during flight. For example, in collaboration with Lockheed Martin, the researchers have investigated how tactile feedback (the sense of touch) can augment pilot perception and improve performance in visually degraded scenarios. They’ve developed algorithms and methods that help pilots flying in darkness, through clouds, or under other challenging circumstances, ultimately increasing flight safety.
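To give a flavor of the general idea (this is not the lab’s or Lockheed Martin’s actual algorithm), a haptic cueing law can be as simple as mapping flight-path tracking error to a vibrotactile intensity on one side of the body. The sketch below uses proportional cueing with a dead zone and saturation; the thresholds, sign convention, and left/right mapping are all illustrative assumptions.

```python
def tactile_cue(lateral_error_m: float,
                dead_zone_m: float = 0.5,
                saturation_m: float = 5.0) -> tuple[str, float]:
    """Map lateral flight-path error to a vibrotactile cue.

    Returns which side of the torso to vibrate ('left', 'right', or
    'none') and a normalized intensity in [0, 1]. Proportional cueing
    with a dead zone and saturation; all thresholds are placeholders.
    Convention here: negative error means the aircraft is left of track.
    """
    magnitude = abs(lateral_error_m)
    if magnitude < dead_zone_m:
        return "none", 0.0  # no cue inside the dead zone
    intensity = min((magnitude - dead_zone_m) / (saturation_m - dead_zone_m), 1.0)
    # Design choice: vibrate on the side the pilot should correct toward.
    side = "right" if lateral_error_m < 0 else "left"
    return side, intensity

# Example: 2 m left of track cues toward the right at moderate intensity.
print(tactile_cue(-2.0))  # ('right', 0.333...)
```

In practice such a cue would be retuned per task and per device, and translating the normalized intensity into actuator commands depends on the specific vest or glove hardware.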
More recently, in a project funded by the National Science Foundation, they’re examining whether haptic feedback can help reduce pilot training time in the future. Early results are promising, indicating that tactile cues can accelerate training for both piloted rotorcraft and remotely operated vehicles.
Maximizing the in-air sensations
One of the key features of UMD’s XR Lab is its state-of-the-art flight simulators, delivered by Swiss manufacturer BRUNNER Elektronik. Thanks to advanced motion cueing, pilots experience the physical sensations of flight, from subtle shifts to full-force maneuvers, through the motion platforms.
While Prepar3D provides the visual environment during simulations, the flight dynamics—the way the aircraft actually behaves—are custom-built by the lab’s own research team.
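To make that division of labor concrete, the sketch below shows the general pattern: the dynamics code owns the aircraft state and integrates it forward each frame, while the image generator only renders the resulting pose. This is a deliberately minimal point-mass model with assumed values (step rate, mass), not UMD’s actual model; a research-grade rotorcraft simulation would also carry attitude, body rates, and rotor and inflow states.

```python
import numpy as np

DT = 1.0 / 120.0                         # integration step (s); rate is an assumed value
GRAVITY = np.array([0.0, 0.0, -9.81])    # inertial z-up frame (m/s^2)

def step(state: np.ndarray, force_n: np.ndarray, mass_kg: float) -> np.ndarray:
    """Advance a minimal point-mass aircraft state by one step.

    state = [x, y, z, vx, vy, vz] in an inertial frame; force_n is the
    total non-gravitational force in that frame.
    """
    pos, vel = state[:3], state[3:]
    accel = force_n / mass_kg + GRAVITY
    vel = vel + accel * DT
    pos = pos + vel * DT                 # semi-implicit Euler update
    return np.concatenate([pos, vel])

# Hover check: thrust exactly cancels weight, so the state stays put.
mass = 5000.0                            # kg, illustrative
state = np.zeros(6)
hover_thrust = np.array([0.0, 0.0, mass * 9.81])
for _ in range(120):                     # one simulated second
    state = step(state, hover_thrust, mass)
print(state)                             # position and velocity remain ~0
```

One common way to close the loop is to push each updated pose to the visual system, for example through Prepar3D’s SimConnect interface, so the custom dynamics drive what the pilot sees.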
What makes the experience even more immersive is the use of Varjo XR-3 and XR-4 headsets’ advanced mixed reality passthrough technology. Mixed reality allows pilots to physically interact with cockpit elements, as part of the cockpit remains physical while the rest is represented virtually.
To round out the experience, the lab incorporates a range of wearable haptic solutions, including haptic gloves for tactile interaction, Teslasuits that stimulate muscles to mimic the physical forces of flight, and bHaptics vests that deliver vibrational feedback to enhance sensory realism.

Mapping out the mind of a pilot

Understanding the pilot’s mind is just as critical as simulating the aircraft. The UMD research team places a strong emphasis on monitoring brain activity to gain deeper insight into a pilot’s cognitive workload and overall state. “Our goal is to optimize pilot performance throughout a mission, whether in military or civil aviation contexts,” says Saetti.
To achieve this, the lab employs wearable brain-monitoring devices, including headsets equipped with functional near-infrared spectroscopy (fNIRS) and electroencephalogram (EEG) sensors from leading providers like Wearable Sensing and OpenBCI. OpenBCI’s advanced biosensing headset Galea seamlessly integrates EEG sensors with Varjo’s virtual and mixed reality hardware. This fusion allows researchers to capture real-time brainwave data as pilots navigate immersive flight environments.
Together, these tools enable a comprehensive, real-time mapping of the brain and body during flight simulation, providing a window into the pilot’s mental effort and cognitive engagement.
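For a concrete picture of what such a pipeline can look like, the sketch below streams EEG data through BrainFlow, the open-source acquisition library that supports OpenBCI hardware, and computes a theta-to-alpha band-power ratio, one widely used workload heuristic. It runs on BrainFlow’s synthetic board so no hardware is needed; the single-channel choice and the ratio itself are illustrative assumptions, not the lab’s actual metric.

```python
import time
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds
from brainflow.data_filter import DataFilter, WindowOperations

BOARD = BoardIds.SYNTHETIC_BOARD         # stand-in for a physical headset such as Galea

params = BrainFlowInputParams()
board = BoardShim(BOARD, params)
board.prepare_session()
board.start_stream()
time.sleep(5)                            # collect ~5 s of data
data = board.get_board_data()            # channels x samples array
board.stop_stream()
board.release_session()

fs = BoardShim.get_sampling_rate(BOARD)
eeg_channels = BoardShim.get_eeg_channels(BOARD)
nfft = DataFilter.get_nearest_power_of_two(fs)

# Theta/alpha band-power ratio on one channel: a common workload proxy.
ch = eeg_channels[0]
psd = DataFilter.get_psd_welch(data[ch], nfft, nfft // 2, fs,
                               WindowOperations.HANNING.value)
theta = DataFilter.get_band_power(psd, 4.0, 8.0)
alpha = DataFilter.get_band_power(psd, 8.0, 13.0)
print(f"theta/alpha workload proxy: {theta / alpha:.2f}")
```

A real study would average over frontal electrodes, baseline each pilot individually, and combine EEG with the fNIRS and other physiological channels mentioned above.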
Trusted tech for advanced research
According to Assistant Professor Saetti, the VR/XR research lab turns to Varjo headsets for projects where precision, realism, and seamless integration are non-negotiable. One standout feature is Varjo’s Focal Edition option, which offers ultra-high-resolution visuals exactly where pilots need them most: at close range within the cockpit.
“That’s one reason we’ve been very pleased with the Varjo XR-3 and XR-4 Focal Editions,” Saetti says. “We chose Varjo headsets specifically because Varjo is a market leader and selecting them was simply a no-brainer.”
The lab also values how easily Varjo devices integrate with the other key systems and software in its simulation ecosystem. “Their seamless integration with BRUNNER simulators has also made the setup and use straightforward for us,” Saetti notes. “Additionally, Varjo’s excellent customer and technical support has consistently met our needs.”

Charting the next wave of simulation technology
With its research at the forefront of innovation, UMD’s XR Lab is constantly developing technologies that go beyond current commercial capabilities. Next, the team is exploring how its sensory cueing methods, such as haptic feedback, could boost engagement in remote operations like drone piloting.
In these scenarios, pilots often rely solely on visual input, which can lead to fatigue and reduced alertness. The researchers aim to determine whether incorporating haptic feedback could increase pilot engagement and introduce a sense of physical connection to the remote aircraft.
The lab is also venturing into a groundbreaking area known as galvanic vestibular stimulation (GVS), a technique that uses electrodes worn on the head to deliver small electrical signals that the brain interprets as motion. While still in early development, GVS has the potential to simulate sensations of motion, such as turning or acceleration. Because current motion platforms are constrained in acceleration bandwidth and range of movement, GVS could provide a more realistic simulation experience for extreme flight conditions and potentially supplement or even replace physical motion platforms.
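Conceptually, the drive signal for such a system could be as simple as a gain from simulated motion to electrode current with a hard safety clamp, as in the purely hypothetical sketch below. Every number in it is an illustrative placeholder; a real GVS setup needs hardware current limiting, per-subject calibration, and safety and ethics review.

```python
def gvs_current_ma(roll_rate_dps: float,
                   gain_ma_per_dps: float = 0.02,
                   limit_ma: float = 1.5) -> float:
    """Map simulated roll rate to a bilateral GVS electrode current (mA).

    Hypothetical linear mapping with a hard clamp. Published GVS studies
    typically stay in the low single-digit milliamp range; the gain and
    limit here are placeholders, not validated values.
    """
    current = gain_ma_per_dps * roll_rate_dps
    return max(-limit_ma, min(limit_ma, current))

# Example: a 30 deg/s simulated roll maps to 0.6 mA; 200 deg/s clamps at 1.5 mA.
print(gvs_current_ma(30.0), gvs_current_ma(200.0))
```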
This could revolutionize how high-g maneuvers, planetary landings, or even weightlessness in space training are simulated. Though challenges remain—such as ensuring repeatability and consistency across different individuals—the UMD lab can envision a future where GVS becomes a viable tool for creating ultra-realistic flight experiences.
“Collaborating with technology partners to explore this further could unlock incredible potential. Our ultimate goal would be to develop this technology into a viable commercial product in the future,” Saetti concludes.