
Behind the Scenes of Project Antoinette: Creating a Photoreal, Accurate, and Real-Time Flight Simulator

August 30, 2022
by John Burwell, Global Lead for Simulation and Training, Varjo; Niclas Colliander, Managing Director of Meta Immersive Synthetics (MIS); and Seb Lozé, Unreal Engine Business Director for the Simulation Division

Flight Sim, Training and Simulation

In May of this year, Epic Games partnered with Varjo, Meta Immersive Synthetics (MIS), and Brunner to showcase the Antoinette Project demo at the World Aviation Training Summit (WATS) in Orlando, Florida. The experiment demonstrated how Unreal Engine and partner technologies could produce a cost-efficient, highly deployable, extremely immersive, military-grade flight simulator.

Interviews with the major participants reveal the motivation, inner workings, and results of the Antoinette Project, and with them a pathway for Unreal Engine creators and developers who want to build simulators powered by the popular game engine.

We sat down with Seb Lozé, Unreal Engine Business Director for the Simulation Division; Niclas Colliander, Managing Director of Meta Immersive Synthetics (MIS); and John Burwell, Global Lead for Simulation and Training at Varjo, to learn more about their participation in the project and its results.

Image courtesy of Meta Immersive Synthetics

Epic Games designed the Antoinette Project to fulfill several needs within the simulation industry. Tell us why the Antoinette Project was created and what challenges it solved.

Seb: Many industry participants and new entrants to the simulation community wanted guidance on where and how to start building a flight simulator with Unreal Engine. We knew we could meet that need by providing a comprehensive set of resources to support the creation of the next generation of flight simulators. With that, the Antoinette Project was born.

The project’s name pays homage to La Société Antoinette, the French simulation pioneers who, in 1906, created the Antoinette Barrel: the first known device built to show pilots what they would sense when flying an airplane. We wanted to pay tribute to those pioneers.

With Unreal Engine as the creative platform, what else did you need to create the Antoinette Project flight simulator demo?

Seb: Rather than reinvent the wheel, we decided to work with some key players in the field to build a portable demo and create a reference to inspire our community of developers. In addition to the Unreal Engine platform, we needed companies that offered the physical platform, the visual platform, and integration of all of the devices and software. We chose Brunner, Varjo, and Meta Immersive Synthetics to support the Antoinette Project. These partner companies, while competitors in some respects, all share the goal of developing high-fidelity, cost-effective simulation training. I think we all see virtual and mixed reality as the future of simulation training in a wide range of areas, so providing a pathway for others to develop training like this is a great start in moving this forward.

All elements of the Antoinette Project debuted at the World Aviation Training Summit (WATS) 2022 in May.

The demo included NOR, a software framework from MIS, which enables developers to build training scenarios and serious flight simulation applications by leveraging all the expertise of Meta Aerospace. You can read more about this on the Unreal Engine blog.

It also included Brunner’s highly deployable 6DOF motion platform, which uses advanced motion-cueing algorithms and high-fidelity control loading units. The kinaesthetic cues given by the platform let the pilot feel the aircraft they’re sitting in. This feedback not only deepens the pilot’s immersion, but also trains muscle memory and a feel for the aircraft’s movements and forces. Brunner built their Unreal Engine integration on the open-source motion-cueing interface developed by Epic Games; the plugin enables companies to develop their own integration for any motion-cueing solution they need.
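As a conceptual illustration of what a motion-cueing algorithm does, the sketch below shows a classical washout filter reduced to a single axis. This is a generic, hypothetical example, not the API of Epic’s open-source motion-cueing interface or of Brunner’s platform: it passes acceleration onsets through to the platform and washes out sustained accelerations that the limited actuator travel cannot reproduce.

```cpp
// Illustrative one-axis washout sketch; not the actual API of Epic's
// open-source motion-cueing interface or of Brunner's platform.
#include <algorithm>
#include <cstdio>

class HighPassWashout {
public:
    // cutoffHz: sustained accelerations below this frequency are washed
    // out so the platform can return to center within its travel limits.
    HighPassWashout(double cutoffHz, double dt) {
        const double kPi = 3.14159265358979323846;
        alpha = 1.0 / (1.0 + 2.0 * kPi * cutoffHz * dt);
    }

    // First-order high-pass filter: passes acceleration onsets,
    // attenuates steady accelerations the platform cannot sustain.
    double step(double accelIn) {
        double out = alpha * (prevOut + accelIn - prevIn);
        prevIn = accelIn;
        prevOut = out;
        return out;
    }

private:
    double alpha = 0.0, prevIn = 0.0, prevOut = 0.0;
};

// Hypothetical per-frame hook: scale the surge onset cue into a platform
// translation command, clamped to an assumed +/-0.25 m actuator stroke.
// A parallel low-pass path (not shown) would tilt the platform so gravity
// stands in for sustained acceleration ("tilt coordination").
double SurgeCueToTranslationM(HighPassWashout& hp, double surgeAccelMps2) {
    const double gainM = 0.02; // assumed meters per (m/s^2) of onset cue
    return std::clamp(hp.step(surgeAccelMps2) * gainM, -0.25, 0.25);
}

int main() {
    HighPassWashout surge(0.8 /*Hz*/, 1.0 / 120.0 /*assumed 120 Hz sim tick*/);
    // Simulated throttle-up: a 2 m/s^2 surge held constant for one second.
    // The command spikes at onset and then washes back toward center.
    for (int i = 0; i < 120; ++i) {
        double cmd = SurgeCueToTranslationM(surge, 2.0);
        if (i % 30 == 0)
            std::printf("t=%.2f s  platform surge cmd = %+.4f m\n", i / 120.0, cmd);
    }
    return 0;
}
```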

Finally, the demo featured the Varjo headset, which produces high-resolution, out-the-window scenes in a very portable form factor. You can read more about it on the Varjo website.

Varjo received an Epic Games MegaGrant prior to this project. Did that play a role in how Varjo was able to participate in the Antoinette Project?

John: Epic Games awarded Varjo a MegaGrant to further develop our mixed reality support for Unreal Engine. With the grant, Varjo moved to OpenXR for the Varjo mixed reality headset, which offers human-eye resolution visual fidelity, integrated depth sensing, and low-latency video pass-through mixed reality. With OpenXR as the target interface, developers now have access to the industry’s most advanced enterprise-grade mixed reality features for compositing real and virtual environments across a wide variety of applications.

The support from Epic Games let us expand our delivery of mixed reality solutions for the most demanding enterprise VR/XR applications through Unreal Engine. Specifically on the Antoinette Project, Varjo’s OpenXR features include full support for photorealistic visual fidelity, eye tracking, and real-time chroma keying.
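To give a flavor of what targeting OpenXR looks like in practice, here is a minimal sketch, separate from Varjo’s own SDK samples, that asks the runtime which Varjo vendor extensions it exposes. The extension names below come from the OpenXR registry; the authoritative list of features per headset lives in Varjo’s documentation.

```cpp
// Sketch: probe an OpenXR runtime for Varjo vendor extensions before
// enabling features such as quad views or eye-tracked foveated rendering.
// Uses only the standard OpenXR loader API; link against the OpenXR loader.
#include <openxr/openxr.h>
#include <cstdio>
#include <cstring>
#include <vector>

bool RuntimeHasExtension(const char* name) {
    uint32_t count = 0;
    if (XR_FAILED(xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr)))
        return false;
    std::vector<XrExtensionProperties> props(count, {XR_TYPE_EXTENSION_PROPERTIES});
    if (XR_FAILED(xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data())))
        return false;
    for (const auto& p : props)
        if (std::strcmp(p.extensionName, name) == 0) return true;
    return false;
}

int main() {
    // Two Varjo extensions from the OpenXR registry; availability depends
    // on the installed runtime and connected headset.
    const char* wanted[] = {"XR_VARJO_quad_views", "XR_VARJO_foveated_rendering"};
    for (const char* ext : wanted)
        std::printf("%s: %s\n", ext, RuntimeHasExtension(ext) ? "available" : "missing");
    return 0;
}
```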

What did you see as the most significant aspect of the pilot simulation you created through the Antoinette Project?

John: What we’re seeing with the Antoinette Project is top-quality graphics and exceptional amounts of realism. That’s really important for pilots when you’re trying to create the suspension of disbelief that you need with immersive technologies.

In virtual and mixed reality training, high visual fidelity is critical so pilots can see cockpit displays and objectives clearly. Compared with traditional display systems, XR training setups can cut the cost of training devices by anywhere from half to a full order of magnitude with little loss of fidelity.

The lower device and operational costs brought by VR and XR translate into greater availability of training tools and all-around better scalability, letting trainees get more reps and sets and repeat tasks until they achieve mastery.

XR Pilot Training

MIS played a major role in the Antoinette Project as the software integrator. What can you tell us about MIS’ contribution?

Niclas: MIS provided our software platform, NOR, an Unreal Engine-based simulation platform that we’ve been building for the last couple of years. Specifically for the Antoinette Project, we also used the Eurofighter Typhoon model.

NOR is a simulation engine based on Unreal Engine, on top of which we build all the frameworks for simulating aircraft, ground vehicles, sensors, and the like. We’re also developing our own full-planet environment in which you can simulate things.

We started developing NOR in partnership with Epic. It offers high-fidelity flight simulation while also providing a rich world down at eye level, complete with dust, weather, sound, and other factors that impact the mission.

There are always challenges when integrating hardware and software, but there were no major ones on this project. MIS already had integration experience from working with Varjo headsets before, and the only new integration was with the Nova platform from Brunner. We hadn’t used that one before, but the process was fairly painless because we had experimented with motion platforms in the past.

Image courtesy of Meta Immersive Synthetics

What kind of feedback did you receive when you unveiled the Antoinette Project at WATS 2022?

Niclas: The feedback on the Antoinette Project during the WATS show was very good, both from a graphical perspective and from a usability perspective. People from many different branches have been complimentary, and that is very important. A couple of retired pilots we know tried the demo, and they were very impressed. So, you know, when you can get pilots in there and they give you the thumbs up, you’re doing something right! And the thing is, I’m a fighter pilot myself, so I’m pretty confident in what we do. But it’s always good to see that others like it too.

We are technically very hardware-agnostic; when it comes to HMDs, we just want to use the best there is, and the Varjo headset is the highest-resolution headset out there right now. In air tactics, being able to read dials, gauges, and tactical screens in the aircraft is very important, and the dynamic foveated rendering in the Varjo headset is what makes that high resolution possible.

With dynamic foveated rendering, you track the eyes and render at full resolution only where the user is actually looking at any given moment. That makes it easier for the PC to keep up with the high resolution and enables high performance at the same time. It also helps keep people from getting motion sick: you need a high frame rate to avoid that, and pushing a lot of pixels at high resolution with a lot of graphical effects makes that hard. Foveated rendering delivers higher performance at that higher pixel count, and that’s why the combination of high resolution and dynamic foveated rendering is key.
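A quick back-of-the-envelope sketch shows why this matters for performance. All the numbers below are illustrative assumptions, not Varjo’s actual display or rendering parameters:

```cpp
// Why eye-tracked foveation helps: shade a small full-resolution inset
// where the user is looking plus a reduced-resolution periphery, instead
// of shading everything at full resolution. All numbers are assumptions.
#include <cstdio>

int main() {
    const double fullW = 4096, fullH = 4096;  // assumed per-eye full-res target
    const double insetFrac = 0.25;            // foveal inset spans 25% of each axis
    const double peripheryScale = 0.5;        // periphery shaded at half resolution

    double naive     = fullW * fullH;
    double inset     = (fullW * insetFrac) * (fullH * insetFrac);
    double periphery = (fullW * peripheryScale) * (fullH * peripheryScale);
    double foveated  = inset + periphery;

    std::printf("full-res shading: %.1f Mpix/eye\n", naive / 1e6);
    std::printf("foveated shading: %.1f Mpix/eye (%.0f%% of full)\n",
                foveated / 1e6, 100.0 * foveated / naive);
    // With these assumptions, foveation shades roughly 31% of the pixels,
    // which is the headroom that lets the PC hold a comfortable frame rate.
    return 0;
}
```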

In general, NOR and Varjo are a pretty nice combination. NOR, leveraging Unreal Engine, has very high graphical fidelity, and with a headset that lacks high resolution, you lose a lot of that to a blurry image. So the combination is not only a nice demo, but a pretty strong use case. We prefer using the Varjo headset because it means people actually see how good things look in NOR.

Image courtesy of Meta Immersive Synthetics

In my opinion, in most use cases, VR or MR is going to take over from classical systems such as domes or screens. It’s already happening to some extent, though adoption has been slower than people expected, both in the defense space and obviously in recreational spaces. But for simulation training, VR and MR are going to take over, and I think we’re starting to see even the bigger OEMs move in the direction of VR.

The Antoinette Project showed that industry players can come together to create a very effective simulation training tool.

Seb, you mentioned a comprehensive set of resources that Epic developed to help other creators and developers follow in the Antoinette Project’s footsteps. Can you talk about those resources?

Seb: The DIY tutorial illustrates how simple and fast it is to create a basic flight simulator. The tutorial provides instructions on how to connect input control devices for the pilot interface, such as a keyboard and mouse, gamepad, joystick, or flight-specific control device. It also shows how to integrate an aircraft model from the Unreal Engine Marketplace, add accurate flight dynamics using the open-source JSBSim plugin for Unreal Engine, and simulate flying above real-world data using Cesium for Unreal or the Esri ArcGIS Maps SDK for Unreal Engine.
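To give a sense of the flight-dynamics layer behind that tutorial, here is a minimal standalone sketch that drives JSBSim’s open-source C++ core directly, using stock JSBSim model and property names. The Unreal Engine plugin wraps this same core in its own classes, so treat this purely as an illustration of the dynamics side, not of the plugin’s API:

```cpp
// Minimal standalone JSBSim loop. Assumes the program runs from a JSBSim
// data directory so the stock "c172x" aircraft definition can be found;
// build against the JSBSim library headers.
#include "FGFDMExec.h"
#include <cstdio>

int main() {
    JSBSim::FGFDMExec fdm;
    if (!fdm.LoadModel("c172x")) return 1;       // stock Cessna 172 model

    // Initial conditions and control inputs go through the property tree.
    fdm.SetPropertyValue("ic/h-sl-ft", 5000.0);  // start at 5,000 ft MSL
    fdm.SetPropertyValue("ic/vc-kts", 110.0);    // 110 kt calibrated airspeed
    fdm.RunIC();                                 // apply the initial conditions

    fdm.SetPropertyValue("fcs/throttle-cmd-norm", 0.75); // 75% throttle
    // (Engine start and trim are omitted for brevity.)

    // Step the dynamics; in the tutorial, these states would drive the
    // aircraft pawn's transform each tick instead of being printed.
    for (int i = 0; i < 1200; ++i) {             // ~10 s at the default 120 Hz
        fdm.Run();
        if (i % 120 == 0)
            std::printf("t=%5.1f s  alt=%8.1f ft  vc=%6.1f kt\n",
                        fdm.GetSimTime(),
                        fdm.GetPropertyValue("position/h-sl-ft"),
                        fdm.GetPropertyValue("velocities/vc-kts"));
    }
    return 0;
}
```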

Image courtesy of Epic Games

The Trends and Best Practices for Flight Simulation information paper provides guidelines for using Unreal Engine in this context. The paper also details Varjo’s support for the Antoinette Project.

The JSBSim plugin for Unreal Engine brings JSBSim, an open-source flight dynamics application, into the engine for flight simulation.

To start your own project and build VR or XR applications using Unreal Engine with your preferred Varjo HMD, download the most recent Varjo OpenXR plugin.

The Antoinette Project proved that, through collaboration and commercial technologies, building photoreal, accurate, real-time simulators has never been more accessible.