The demo starts when you put on the Varjo mixed reality headset. At that point the mounted passthrough cameras fire up and display video of the outside world inside the headset, replicating what you would see without the headset on at all.
The main secret sauce that makes this mixed reality (virtual and real elements combined) effect feel natural is the ultra-low latency of the 12-megapixel video feed of the ‘outside world’, which updates fast enough not to lag behind the movements of your head.
One of the most impressive visual aspects of this mixed reality demo is how the birds fly around the multiple cross-beams that span the width of the room, not only avoiding them but also being occluded, or hidden, when passing behind or above them. Their shadows also fall appropriately onto the various surfaces.
This effect was achieved by creating a simple 3D model of the space in advance, including the walls, floor, ceiling, and beams. This model is then used as a mask and as a shadow catcher material, telling the Unity-built application when the birds should not be visible and where their shadows should fall.
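The post doesn’t include the project’s source, but masking like this is commonly done by rendering the room model with a depth-only material, so that any virtual object behind it fails the depth test and the passthrough video shows through instead. Here is a minimal sketch under that assumption; the `RoomOccluder` component and `occluderMaterial` names are hypothetical, and the material’s shader would write depth but no color (e.g. `ColorMask 0` with `ZWrite On` in ShaderLab):

```csharp
using UnityEngine;

// Minimal sketch: apply a depth-only 'occluder' material to the pre-built
// room model so virtual birds behind the real beams are hidden.
// Assumes occluderMaterial uses a shader that writes depth but no color.
public class RoomOccluder : MonoBehaviour
{
    [SerializeField] private Material occluderMaterial; // hypothetical depth-only mask material

    void Start()
    {
        // Replace every material on the room model (walls, floor, ceiling,
        // beams) with the invisible depth mask.
        foreach (Renderer r in GetComponentsInChildren<Renderer>())
        {
            var mats = r.sharedMaterials;
            for (int i = 0; i < mats.Length; i++)
                mats[i] = occluderMaterial;
            r.sharedMaterials = mats;
        }
    }
}
```

With a depth mask like this in place, occlusion falls out of the ordinary depth test; no per-bird visibility logic is needed.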
The demo uses Unity’s Universal Render Pipeline, which handles the shadows by allowing a shadow catcher material to be added to the atrium’s surfaces. Currently, the shadows flicker a little and are not drawn very accurately due to the sheer number of birds. There is, however, scope to improve this, and even to experiment with other shadow techniques such as ray tracing.
As the birds navigate the 3D space and the headset wearer changes position, the occlusion remains correct from the wearer’s perspective. The feeling of immersion this creates is quite astounding in person.
The Varjo HQ atrium is a well-lit room, thanks to skylight windows running the length of the ceiling and multiple floors of open corridors with fluorescent lighting. Because the flying birds have a shiny, chrome-like texture, they reflect the multiple light sources around the room, as well as broad visual details, as they meander along their flight paths.
This effect was achieved by first creating, in advance, a panoramic HDR (high dynamic range) capture of the space using an off-the-shelf 360 camera. The camera takes images across a full range of exposure settings, and this data is then combined into a single 360 image to be fed into the Unity project.
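The post doesn’t show how the capture is wired up, but one straightforward way to use such a panorama in Unity is to import it as a cubemap and set it as the scene’s custom reflection source. A minimal sketch, assuming a recent Unity version (older versions use `RenderSettings.customReflection` instead of `customReflectionTexture`) and a hypothetical `atriumHdrCubemap` asset:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: use the pre-captured HDR panorama (imported as a cubemap)
// as the scene's reflection source, so the chrome-like birds pick up the
// atrium's lighting. 'atriumHdrCubemap' is a hypothetical asset name.
public class AtriumReflections : MonoBehaviour
{
    [SerializeField] private Cubemap atriumHdrCubemap;

    void Start()
    {
        // Override Unity's generated skybox reflections with the real capture.
        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflectionTexture = atriumHdrCubemap;
    }
}
```

Assigning the cubemap this way is equivalent to setting a custom environment reflection source in the Lighting window; smooth, metallic materials like the birds’ chrome texture will then sample it directly.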
Similar to the shadows, the reflections could be made even more detailed and impressive by using ray tracing in future iterations of this demo. However, this would demand very high-end GPU performance. We are lucky to have some of the best NVIDIA GPUs available today on hand at the Varjo HQ, including the NVIDIA RTX 6000 Ada Generation (in use during the recording of the demo video) and the NVIDIA GeForce RTX 4090, so implementing ray tracing is very plausible.
You may also notice, during the video, a glimpse of what looks like a QR code mounted on one of the beams. This is a Varjo Marker: a physical reference point that allows the headset to accurately align the previously mentioned 3D model used for masking.
The virtual contents of the whole demo can be repositioned if needed with just a quick glance up at the marker and a tap of the headset button, allowing the mixed reality passthrough cameras to recognize it and position the mask accordingly.
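As a rough illustration of that realignment step, a sketch using the Varjo Unity XR plugin’s marker API might look like the following. The exact class and field names (`VarjoMarkers`, `VarjoMarker.pose`) are assumptions and should be checked against the current SDK documentation, and the key press here is just a stand-in for the headset button:

```csharp
using System.Collections.Generic;
using UnityEngine;
using Varjo.XR;

// Hedged sketch of the realignment step: on a button tap, read the tracked
// Varjo Marker's pose and snap the room mask model onto it. Assumes the
// Varjo Unity XR plugin's VarjoMarkers API; verify names against the SDK docs.
public class MaskAligner : MonoBehaviour
{
    [SerializeField] private Transform roomMaskModel; // the pre-built 3D model of the space

    void Start()
    {
        VarjoMarkers.EnableVarjoMarkers(); // start tracking printed markers
    }

    void Update()
    {
        // Stand-in input: the real demo triggers this from the headset button.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            VarjoMarkers.GetVarjoMarkers(out List<VarjoMarker> markers);
            if (markers.Count > 0)
            {
                // Reposition the mask so it matches the marker's real-world pose.
                roomMaskModel.SetPositionAndRotation(
                    markers[0].pose.position, markers[0].pose.rotation);
            }
        }
    }
}
```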
Ok, so we get it: the ‘birds’ fly around and are hidden behind beams. But what is it that makes it all look so uncannily natural? It comes down to the flocking behavior. The birds appear to have their own individual agendas, choosing to follow flocks, avoid obstacles and each other, and move with purpose. Random and chaotic, yet simultaneously acting completely ‘by design’: a visual phenomenon usually seen only in nature.
This incredible effect was achieved with the classic Boids algorithm. It’s a fascinating model that makes the birds behave according to three ‘rules’ (separation, alignment, and cohesion), whose strengths can be configured as numerical values in the setup of the demo.
Finely balancing the values of these three rules creates very natural-looking, flock-like movement patterns. The algorithm and some other interesting applications of it are very well explained in this video by YouTuber Sebastian Lague.
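The demo’s actual implementation uses ECS Swarm (covered next), but a minimal, illustrative version of the three rules in plain Unity C# might look like this; the field names and weights are hypothetical:

```csharp
using UnityEngine;

// Illustrative Boids update (not the demo's actual ECS Swarm code): each bird
// steers by three weighted rules computed from its nearby neighbors.
public class Boid : MonoBehaviour
{
    public float separationWeight = 1.5f; // avoid crowding neighbors
    public float alignmentWeight  = 1.0f; // match neighbors' heading
    public float cohesionWeight   = 1.0f; // drift toward the flock center
    public float neighborRadius   = 3f;
    public float speed            = 4f;

    public Vector3 velocity = Vector3.forward;

    void Update()
    {
        Vector3 separation = Vector3.zero;
        Vector3 alignment  = Vector3.zero;
        Vector3 cohesion   = Vector3.zero;
        int neighborCount  = 0;

        // Naive O(n^2) neighbor scan for clarity; real implementations
        // (like ECS Swarm) use spatial partitioning for performance.
        foreach (Boid other in FindObjectsOfType<Boid>())
        {
            if (other == this) continue;
            Vector3 offset = other.transform.position - transform.position;
            if (offset.sqrMagnitude > neighborRadius * neighborRadius) continue;

            separation -= offset / Mathf.Max(offset.sqrMagnitude, 0.01f); // push away, stronger when closer
            alignment  += other.velocity;
            cohesion   += other.transform.position;
            neighborCount++;
        }

        if (neighborCount > 0)
        {
            alignment /= neighborCount;
            cohesion = cohesion / neighborCount - transform.position;

            // Blend the three rules by their configurable weights.
            Vector3 steer = separationWeight * separation
                          + alignmentWeight  * (alignment - velocity)
                          + cohesionWeight   * cohesion;
            velocity += steer * Time.deltaTime;
        }

        velocity = velocity.normalized * speed;
        transform.position += velocity * Time.deltaTime;
        transform.rotation = Quaternion.LookRotation(velocity);
    }
}
```

Raising the separation weight makes the birds spread out and dodge each other harder, while raising cohesion pulls them into tighter groups; tuning these against each other is what produces the natural look described above.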
To introduce the Boids-based behavior, a pre-built framework called ECS Swarm, built on Unity’s Entity Component System (ECS), was used. Created by the user ‘Tigpan’, it’s an efficient way to bring similar flocking behavior into your own projects.