How to Create an Immersive VR Art Experience
Two Rocks Do Not Make a Duck is an augmented virtuality artwork by the artists Cameron MacLeod and Magnhild Øen Nordahl, and a showcase of cutting-edge VR art. In this blog post the artists describe how they made it and some of the thinking behind it.
Description of the artwork
The installation was recently shown at the Munch Museum, where we built a three-dimensional floor with three interactive, rock-shaped sculptures in front of a window overlooking Akerselva, Oslofjorden and downtown Oslo. When a user put on the VR headset, they would see a simulation of the landscape surrounding the museum as well as of the floor and the rocks. What the simulation did not include were the buildings, streets and other elements of the city, which were replaced by a virtual wilderness environment.
At the beginning of the experience the virtual sun, moon and stars were synchronized to the user’s location and the current time, so that the lighting of the physical space matched the virtual world. As soon as the user picked up one of the rocks, VR time and actual time would start to diverge, and the environment changed accordingly. One rock changed the time of day and let the user experience Bjørvika shift from morning to midnight; the faster the user moved it, the faster the changes happened, and many museum visitors enjoyed just sitting on the floor watching the starry night sky. The second rock changed the season and took the user through spring, summer, fall and winter. The third rock let users play with the weather and see fierce thunderstorms as well as calm summer days with the sun reflecting off the glassy water. When users built a cairn, also called a duck in the Canadian expression the title refers to, VR time was reset to align with actual time again, meaning that the relationship between sun and earth was once more the same in the simulation as in physical space.
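To give a sense of how this interaction works under the hood, here is a simplified Unreal C++ sketch of the time-of-day rock. It is illustrative rather than the project's actual code: names such as HoursPerMeter and SetSkyTimeOfDay are placeholders standing in for the values we pushed into the Ultra Dynamic Sky blueprint.

```cpp
// TimeRock.h — minimal sketch only, not the actual project code.
// A tracked rock whose physical movement speed scales how fast the virtual
// time of day advances. SetSkyTimeOfDay is an assumed stand-in for the
// Ultra Dynamic Sky blueprint interface.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TimeRock.generated.h"

UCLASS()
class ATimeRock : public AActor
{
    GENERATED_BODY()

public:
    ATimeRock() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Distance the rock (driven by its Vive tracker) moved this frame, in cm.
        const FVector Location = GetActorLocation();
        const float MovedCm = static_cast<float>(FVector::Dist(Location, PreviousLocation));
        PreviousLocation = Location;

        // Faster movement -> faster day/night cycle.
        VirtualHour = FMath::Fmod(VirtualHour + (MovedCm / 100.f) * HoursPerMeter, 24.f);
        SetSkyTimeOfDay(VirtualHour);
    }

    // Called when the visitor completes a cairn: realign VR time with real time.
    void ResetToRealTime()
    {
        const FDateTime Now = FDateTime::Now();
        VirtualHour = Now.GetHour() + Now.GetMinute() / 60.f;
        SetSkyTimeOfDay(VirtualHour);
    }

protected:
    // Virtual hours that pass per metre of physical movement (tuning value).
    UPROPERTY(EditAnywhere)
    float HoursPerMeter = 2.f;

    float VirtualHour = 12.f;                      // 0–24 h
    FVector PreviousLocation = FVector::ZeroVector;

    // Stand-in: in the installation this value drove the Ultra Dynamic Sky blueprint.
    void SetSkyTimeOfDay(float Hours) { /* push to sky system */ }
};
```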
The correspondence between VR and physical space remained strong in other ways as well. The user’s position in the landscape was always the same inside and outside VR: it was not possible to teleport through the virtual scenery, and the perimeter of the exhibition space bounded movement in VR too. We felt that this parallel grounding created a convincing experience of a truly mixed reality. That experience could also extend beyond the visit itself, with visitors carrying with them an embodied memory of being physically present in this different version of Bjørvika, a feeling sometimes referred to as embodied retention.
Blending Art and Technology
Magnhild’s practice as a sculptor combined with Cameron’s interest in new technologies became the starting point for our collaboration. We wanted to explore the artistic potential of augmented virtuality and figure out how the combination of interactive sculpture and virtual reality could be used differently from what we had seen in artistic and commercial work so far. The contemporary art field is strongly influenced by developments in technology, which continually provide new tools to make work with.
One way contemporary art can contribute back to the technology sector is by using its products in unexpected ways: combining them with new elements and presenting them in different ways and contexts, thereby expanding both how the medium is understood and how it is used. In that sense we believe our artwork also has relevance for the technology sector.
How it was made
“Two Rocks Do Not Make a Duck” was developed between 2018 and 2022, a period of impressive technological development for VR. Each step of the working process was influenced by new software and hardware releases, and we were constantly testing the limits of what was possible with the current technology and changing the piece accordingly.
The environment was developed in Unreal Engine 5.0.3, with Ultra Dynamic Sky providing the base template for the sky and weather as well as accurate positioning of the sun and other celestial bodies. The Quixel Megascans library supplied the high-detail assets needed for the terrain and foliage. To make the physical rocks, we 3D printed the negative shapes of rock assets from Quixel as molds and cast them in acrylic plaster. We also designed 3D-printed fixtures to attach Vive Tracker 3.0 units directly onto the rocks, so the physical rocks and their virtual counterparts shared the same coordinates in the VR scene. Four Vive 2.0 base stations tracked the rocks, and the rocks’ tracked movement drove the weather and time parameters in the Ultra Dynamic Sky blueprint. The floor was created in Quixel Mixer from several natural ground textures in the library; the resulting mesh was CNC milled and coated with acrylic plaster, and its position was synchronized with the virtual environment using Vive trackers.
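As a rough illustration of the tracking setup (not the project’s actual code), a virtual rock can be bound to a Vive tracker in Unreal with a motion controller component. The sketch below assumes the SteamVR plugin, where trackers appear as motion sources named Special_1 through Special_8; the project’s exact configuration may have differed.

```cpp
// TrackedRock.h — minimal sketch, assuming the SteamVR plugin where Vive
// trackers show up as motion sources "Special_1" .. "Special_8".
// Requires the HeadMountedDisplay module in the project's Build.cs.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "Components/StaticMeshComponent.h"
#include "TrackedRock.generated.h"

UCLASS()
class ATrackedRock : public AActor
{
    GENERATED_BODY()

public:
    ATrackedRock()
    {
        // The motion controller component receives the tracker's pose each frame.
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->MotionSource = FName(TEXT("Special_1"));   // first Vive tracker
        RootComponent = Tracker;

        // The Quixel rock mesh follows the tracker, so the virtual rock
        // overlaps the physical plaster cast made from the same asset.
        RockMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("RockMesh"));
        RockMesh->SetupAttachment(Tracker);
        RockMesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);
    }

protected:
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* Tracker;
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* RockMesh;
};
```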
The local terrain was recreated from elevation data scanned by the Norwegian Mapping Authority, which we converted into height maps and used to extrude a landscape in Unreal Engine.
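For readers who want to try something similar, a simplified version of that conversion could look like the sketch below. It assumes the elevation data is exported as a GeoTIFF and uses GDAL to write a raw 16-bit heightmap that Unreal’s Landscape import accepts; the filenames are placeholders, and this is one possible approach rather than our exact pipeline.

```cpp
// dem_to_heightmap.cpp — illustrative sketch only (filenames are placeholders).
// Reads an elevation raster with GDAL, normalizes it to 16-bit values, and
// writes a raw .r16 file that Unreal's Landscape import dialog accepts.
//
// Build example: g++ dem_to_heightmap.cpp -lgdal -o dem_to_heightmap
#include <gdal_priv.h>
#include <algorithm>
#include <cstdint>
#include <fstream>
#include <vector>

int main()
{
    GDALAllRegister();

    auto* dataset = static_cast<GDALDataset*>(GDALOpen("bjorvika_dem.tif", GA_ReadOnly));
    if (!dataset) return 1;

    GDALRasterBand* band = dataset->GetRasterBand(1);
    const int width  = band->GetXSize();
    const int height = band->GetYSize();

    // Read all elevation samples (metres above sea level) as 32-bit floats.
    std::vector<float> elevation(static_cast<size_t>(width) * height);
    if (band->RasterIO(GF_Read, 0, 0, width, height, elevation.data(),
                       width, height, GDT_Float32, 0, 0) != CE_None)
        return 1;

    // Normalize to the full 16-bit range expected by Landscape heightmaps.
    const auto [minIt, maxIt] = std::minmax_element(elevation.begin(), elevation.end());
    const float range = std::max(*maxIt - *minIt, 1e-6f);

    std::vector<uint16_t> heightmap(elevation.size());
    for (size_t i = 0; i < elevation.size(); ++i)
        heightmap[i] = static_cast<uint16_t>((elevation[i] - *minIt) / range * 65535.0f);

    // Raw little-endian 16-bit grayscale; import via Landscape > Import from File.
    std::ofstream out("bjorvika_heightmap.r16", std::ios::binary);
    out.write(reinterpret_cast<const char*>(heightmap.data()),
              heightmap.size() * sizeof(uint16_t));

    GDALClose(dataset);
    return 0;
}
```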
The Varjo headset was chosen for its superior resolution, clarity, and seamless integration with Unreal Engine. The Varjo support team played a crucial role in optimizing the project, helping us maintain high fidelity while managing the complex dynamic scene.
In hindsight, using Varjo and Unreal as our primary platforms was the best choice for the project, which ran for three months and served hundreds of museum visitors without any major technical issues. While we consider this a success, the road there was not without obstacles. A main challenge was achieving compatibility between all of the different software and hardware components. The Varjo team helped us fine-tune settings and make quick adjustments through Varjo Base, resulting in the desired look and feel for the project.