Creating the world’s most immersive mixed reality demo

March 24, 2023
by Jani Ylinen, 3D Graphics Specialist, Varjo

If you’ve ever sat watching Hitchcock’s ‘The Birds’ and thought, ‘this would be far more pleasant if the birds were reflective letter V’s that didn’t want to kill me’, then you’re in the right place.

The Varjo Birds mixed reality demo conveys the breathtaking sight of starlings swarming in undulating patterns across the sky, without any ominous fowl consequences. It also happens to be a mind-blowing showcase of the deep immersion achievable with Varjo’s industry-leading mixed reality headsets.

Mixed reality, combining virtual and real content in one view

The demo starts the moment you put on the Varjo mixed reality headset. At that point the mounted passthrough cameras fire up and display video of the outside world inside the headset, replicating what you would see with no headset on at all.

The main secret sauce to making this mixed reality (virtual and real elements combined) effect feel natural is the ultra-low latency of the 12-megapixel video feed of the ‘outside world’, which updates fast enough not to lag behind the movements of your head.

Virtual ‘V’s and futuristic flocks

After triggering the demo to start, 12,000 individual ‘birds’ venture out from starting spherical formations and begin to fly around the room. In this case, the room is the central atrium space of the Varjo headquarters in Finland, seen often in our webinars and demo presentations.

The birds are simple low-polygon shapes that resemble the letter ‘V’ used in the Varjo logo. Keeping the individual birds to a low poly count plays a big role in how all 12,000 of them can be rendered at once while retaining an ultra-fluid, high-framerate experience (up to 90 frames per second).

In fact, 12,000 is absolutely not the limit. With the right GPU and CPU oomph, the number of birds could very feasibly be upped to 25,000+, but of course a balance between jaw-dropping and visually overwhelming has to be struck.


Occlusion and shadows

One of the most impressive visual factors of this mixed reality demo is how the birds will fly around the multiple cross-beams that span the width of the room, not only avoiding them but also being occluded or hidden when passing behind or above them. Their shadows will also fall appropriately onto the various surfaces.

This effect has been achieved by creating a simple 3D model of the space in advance, including the walls, floor, ceiling, and beams. This model is then used as a mask and shadow catcher material to inform the Unity-built application when the birds should not be visible, and where the shadows should fall.
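The masking idea boils down to a per-pixel depth comparison: a bird is only drawn where it is nearer to the camera than the pre-modelled room geometry. The sketch below is a minimal Python/NumPy illustration of that principle only; the actual demo does this on the GPU inside Unity, and the function and array names here are hypothetical.

```python
import numpy as np

def composite_with_mask(passthrough, bird_color, bird_depth, mask_depth):
    """Per-pixel occlusion: a virtual bird is drawn only where its depth is
    smaller (nearer to the camera) than the depth of the room-model mask."""
    visible = bird_depth < mask_depth      # True where the bird is in front of beams/walls
    out = passthrough.copy()               # start from the passthrough video frame
    out[visible] = bird_color[visible]     # paint the bird only where it is not occluded
    return out
```

Run per frame from the wearer’s current viewpoint, this is what keeps the occlusion correct as both the birds and the headset move.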

The demo uses Unity’s Universal Render Pipeline, which handles shadows by allowing a shadow catcher material to be added to the atrium’s surfaces. Currently the shadows flicker a little and are not drawn very accurately, due to the sheer number of birds. There is, however, scope to improve this, and even to experiment with other shadow techniques such as ray tracing.

As the birds navigate around the 3D space, and the headset wearer changes position, the occlusion is constantly correct according to the wearer’s perspective. The feeling of immersion created by this is quite astounding in person.


Reflecting reality

The Varjo HQ atrium is a well-lit room, thanks to skylight windows running the length of the ceiling and multiple floors of open corridors with fluorescent lighting. Because the flying birds have a shiny, chrome-like texture, they reflect the multiple light sources around the room, as well as basic visual details, as they meander along their flight paths.

This effect was achieved by first creating a panoramic HDR (high dynamic range) capture of the space in advance, using an off-the-shelf 360 camera. The camera takes images at a full range of exposure settings, and this data is then combined into a single 360 image to be fed into the Unity project.
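The merging step can be sketched in a few lines. This is a generic Python/NumPy illustration of exposure-bracket HDR merging, not the camera’s own pipeline; the hat-shaped weighting (trusting mid-range pixels over under- or over-exposed ones) is a common choice assumed here.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge a bracket of exposures (float images in [0, 1]) into one HDR
    radiance map. Each pixel's radiance estimate is image / exposure_time,
    weighted by a hat function that peaks for mid-range pixel values."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: 1.0 at 0.5, 0.0 at 0 or 1
        num += w * (img / t)                # weighted radiance estimate
        den += w
    return num / np.maximum(den, 1e-8)      # avoid division by zero
```

For the demo, the resulting 360 HDR image is then used as the environment map that the chrome-like birds reflect.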

Similar to the shadows, the reflections could be made even more detailed and impressive by using ray tracing in future iterations of this demo. However, this will demand very high-end GPU performance. We are lucky to have some of the best NVIDIA GPUs available today on hand at the Varjo HQ, including the NVIDIA RTX 6000 Ada Generation (in use during the recording of the demo video) and the NVIDIA RTX 4090, so implementing ray tracing is very plausible.


Aligning the virtual with the physical

You may also notice, during the video, a glimpse of what looks like a QR code mounted onto one of the beams. This is a Varjo Marker: a physical reference point that allows the headset to accurately align the previously mentioned 3D model used for masking.

The virtual contents of the whole demo can be repositioned if needed with just a quick glance up at the marker and a tap of the headset button, allowing the mixed reality passthrough cameras to recognize it and position the mask accordingly.
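Mathematically, this realignment is a single rigid-transform calculation: given the marker’s detected pose in the headset’s world frame and its known pose within the room model, you get the transform that drops the whole model into place. The sketch below assumes 4×4 homogeneous matrices and is purely illustrative; it is not Varjo’s API.

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def align_model(marker_pose_world, marker_pose_model):
    """Transform that places the pre-built room model in the headset's world
    frame: model->world = (marker pose in world) @ inv(marker pose in model)."""
    return marker_pose_world @ np.linalg.inv(marker_pose_model)
```

Re-running this calculation on a fresh marker detection is all the ‘tap of the headset button’ needs to do to snap the mask back into registration.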


Nailing the bird-like behavior

Ok, so we get it: the ‘birds’ fly around and are hidden behind beams. But what makes it all look so uncannily natural? It comes down to the flocking behavior. The birds appear to have their own individual agendas, choosing to follow flocks, avoid obstacles and each other, and move with purpose. Random and chaotic, yet simultaneously acting completely ‘by design’; a visual phenomenon usually seen only in nature.

This incredible effect has been achieved with the classic Boids algorithm. It’s a fascinating model that causes the birds to behave according to three ‘rules’, which can be configured as numerical values in the setup of the demo.

  • Alignment – How much the birds attempt to fly in the same direction as their flock mates
  • Cohesion – How closely the flocks attempt to clump together
  • Separation – How much the birds attempt to keep their distance from their fellow flock mates

Finely balancing the values between these three rules creates very natural-looking flock-like movement patterns. The algorithm, and other interesting applications of it, are very well explained in this video by YouTuber Sebastian Lague.
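The three rules can themselves be sketched in a few lines. Below is a minimal, deliberately unoptimized Python/NumPy version of the classic Boids update; the demo itself runs a far faster Unity ECS implementation, and all weights and names here are illustrative.

```python
import numpy as np

def boids_step(pos, vel, alignment=0.05, cohesion=0.01, separation=0.05,
               radius=2.0, max_speed=1.0, dt=1.0):
    """One synchronous update of the three classic Boids rules.
    pos, vel: (n, 3) arrays of positions and velocities."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        diff = pos - pos[i]                      # vectors from bird i to all others
        dist = np.linalg.norm(diff, axis=1)
        mates = (dist < radius) & (dist > 0)     # flock mates within the radius
        if mates.any():
            # Alignment: steer toward the average heading of flock mates
            new_vel[i] += alignment * (vel[mates].mean(axis=0) - vel[i])
            # Cohesion: steer toward the centre of the local flock
            new_vel[i] += cohesion * (pos[mates].mean(axis=0) - pos[i])
            # Separation: steer away from mates, more strongly the closer they are
            new_vel[i] -= separation * (diff[mates] / dist[mates, None]**2).sum(axis=0)
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:                    # clamp speed to keep the flock stable
            new_vel[i] *= max_speed / speed
    return pos + new_vel * dt, new_vel
```

Tweaking the three weights is exactly the balancing act described above: raise cohesion and the flock clumps into tight balls; raise separation and it disperses into a loose cloud.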

To introduce the Boids-based behavior, a pre-built framework called ECS Swarm, built on Unity’s Entity Component System (ECS), was used. Created by user ‘Tigpan’, it’s an efficient way to introduce similar flocking behavior into your own projects.


Keep up to date with Varjo

To keep up to date with other projects that push the boundaries of what is possible in virtual and mixed reality, consider subscribing to the Varjo Newsletter.

Browse more Varjo Insider posts
