Blend Virtual and Real – Key Insights from the Chroma Key & Visual Marker Webinar

In our product update webinar in April 2020, we had the pleasure of introducing and demoing Varjo’s latest software features – real-time chroma keying and visual marker tracking – which help merge the synthetic and analogue realities better than ever before.


Let’s dive deeper into the key insights from the webinar

Presenters:

Urho Konttori
Co-founder & Chief Product Officer at Varjo

Jussi Mäkinen
Chief Marketing Officer at Varjo

Ville Timonen
Mixed Reality Lead at Varjo

Saku Tiainen
Mixed Reality Developer at Varjo

 

Watch webinar

 

Implementing chroma key enables mixed reality scenarios where you can place real-life physical objects within a fully immersive virtual reality environment. You can also effortlessly switch between virtual, mixed, and real scenarios, enabling new levels of collaboration between you and your colleagues. At the end of the post, you can find answers to the questions that went unanswered during the webinar.

 

 

Chroma keying – seamless blending of physical objects within virtual scenery

Chroma keying is a technology that the broadcasting and film industries have been using for decades to blend virtual content with the surrounding reality. While the technology itself isn’t new, applying it in a real-time mixed reality setting is novel – especially with the photorealistic quality that can only be achieved with Varjo’s mixed reality headset.

Chroma keying allows users to assign any one color of their choice as the canvas on which the virtual reality scene will appear. Users can then choose different colors for different outputs. For example, you can select blue for all the dynamic movements while using green to showcase the more static objects in a mixed reality setting. Varjo currently supports up to four different colors at the same time, each of them adjustable. In practice, this means users can set the chroma and thresholds for each color separately via the API or the command-line tool. This way, you can, for example, register your normal floor and ceiling color as one key color and a proper green screen as another.
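To make the idea of per-color chroma and threshold settings more concrete, here is a minimal sketch of what such a configuration could look like. The structure and field names below are purely illustrative and are not the Varjo API:

```cpp
#include <algorithm>
#include <array>
#include <cmath>

// Illustrative only: one chroma key entry as described above. The structure
// and field names are hypothetical and do not mirror the Varjo SDK.
struct ChromaKeyEntry {
    bool  enabled = false;
    float targetHue = 0.33f;           // target chroma as a hue in [0, 1)
    float hueTolerance = 0.05f;        // how far a pixel's hue may deviate
    float saturationThreshold = 0.3f;  // minimum saturation to count as keyed
    float valueThreshold = 0.2f;       // minimum brightness to count as keyed
};

// Up to four simultaneously active key colors, matching the limit above.
using ChromaKeyConfig = std::array<ChromaKeyEntry, 4>;

// Returns true if an HSV pixel falls inside any enabled key color.
bool isKeyed(const ChromaKeyConfig& config, float h, float s, float v) {
    for (const ChromaKeyEntry& e : config) {
        if (!e.enabled) {
            continue;
        }
        float dh = std::fabs(h - e.targetHue);
        dh = std::min(dh, 1.0f - dh);  // hue wraps around the color circle
        if (dh <= e.hueTolerance && s >= e.saturationThreshold && v >= e.valueThreshold) {
            return true;
        }
    }
    return false;
}
```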

Chroma key and visual marker tracking give users a true mixed reality experience where real-world elements are blended within the virtual environment, enabling collaboration in both worlds as naturally as sitting together in your brick-and-mortar office. 

 

 

Visual markers for tracking virtual objects

Visual markers are a decades-old technology, typically used in augmented reality applications. At Varjo, we use existing libraries that track the markers, letting you place and anchor virtual reality interactions on top of them or next to them. By using visual markers with the Varjo headset, professional users can place a virtual object in a pixel-perfect, fixed location. This accuracy makes it easy to replace real-life physical objects – such as replicas or prototype parts – with interactive virtual content in a mixed reality setting.

Varjo’s convenient API gives you access to each marker’s identity. Our markers support individual tracking of up to 1,000 objects without the need for active controllers or tracking pucks, which makes building interactions on top of them as easy as possible. Varjo currently provides this implementation for Unreal, Unity, and the native SDK, so users don’t need to work with the lower-level libraries directly.
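Conceptually, marker tracking gives an application a stream of marker identities and poses that virtual content can be anchored to. The sketch below illustrates that pattern in generic C++; the types and names are made up for clarity and are not Varjo SDK code:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hypothetical pose of a tracked marker in the headset's world coordinates.
struct Pose {
    float position[3];
    float orientation[4];  // quaternion (x, y, z, w)
};

// Hypothetical per-frame marker observation exposed by a tracking API.
struct MarkerObservation {
    int64_t id;       // marker identity (up to 1,000 distinct IDs)
    Pose    pose;     // where the marker currently is
    bool    visible;  // whether the cameras saw the marker this frame
};

// Anchors virtual objects to markers: each frame, the latest pose of every
// visible marker is copied onto the virtual object registered for that ID.
class MarkerAnchors {
public:
    void registerObject(int64_t markerId) { anchors_[markerId] = Pose{}; }

    void update(const std::vector<MarkerObservation>& observations) {
        for (const MarkerObservation& obs : observations) {
            auto it = anchors_.find(obs.id);
            if (it != anchors_.end() && obs.visible) {
                it->second = obs.pose;  // virtual content follows the marker
            }
            // If the marker is not visible, the last known pose is kept
            // (see the Q&A section below for the timeout behavior).
        }
    }

    const Pose* poseFor(int64_t markerId) const {
        auto it = anchors_.find(markerId);
        return it != anchors_.end() ? &it->second : nullptr;
    }

private:
    std::unordered_map<int64_t, Pose> anchors_;
};
```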

Varjo currently supports three different sizes of markers:

  • 25mm marker, with an active area of up to 1 meter (3 feet). Best usability within 0.5 meters. These small markers are excellent for tracking something in your hand.
  • 50mm marker, with an active area of up to 2 meters (6.5 feet). Best usability within 1 meter. These are the most stable option for interactions within arm’s reach.
  • 150mm marker, with an active area of up to 5 meters (16 feet). Best usability within 2 meters for dynamic movement. Beyond 3 meters, we recommend setting a static position or using multiple markers to stabilize tracking.

Varjo Marker example code and markers are downloadable from our Developer Portal.

Varjo markers are cost-effective replacements for electronic trackers (for example Vive Trackers).

A game-changer for design work

For designers and engineers, combining chroma key with visual markers is a game-changer. With the latest update, Varjo users are now able to create digital twins of the real-life objects they want to include in a mixed reality scenery. 

Users can 3D print any object with the colors that Varjo’s chroma markers support. The combination of chroma keying technology and visual markers enables users to hold the 3D printed objects, like phones or other gadgets, in their hand with perfect occlusion and “dress” them instantly to look like a material-finished product. 

To make designing even more intuitive and life-like, Varjo’s light model enables accurate reflections of the real world around the virtual objects.  The object follows the chroma visual markers perfectly through the Varjo headset, making it possible to visualize and re-design prototypes in seconds, without the need to actually build a physical prototype. 

 

Track objects accurately using visual markers

Our latest software release introduces visual marker-based object tracking. Using visual markers and the Varjo headset, professionals can anchor virtual objects exactly where they want them in their surroundings, fixing the exact position of virtual displays, controls, or other objects in the reality around you.

Full-body presence through perfect occlusion of the virtual

Chroma keying can bring individual physical objects into the virtual experience – we are talking about tools, equipment, control sticks, and so on. To understand how this works, let’s think about pilots in training – flying a plane is much more than just knowing the ins and outs of the cockpit. To fly a real aircraft, pilots must develop non-visual muscle memory of the location and operation of certain controls, such as the throttle, in relation to their position in the cockpit. This is critical know-how for any pilot and flight mission.

With chroma key markers, you can still rely on the cost-efficient virtual scenery currently used in training simulations, but you can now mix physical objects into the scenery as well, bringing the experience as close as possible to real life. For example, you can use chroma key markers to include a physical, real-life control stick in the mixed reality setting of the simulation. This means that the pilot in training can experience the tactile sensation of using the control stick, developing the much-needed muscle memory, while operating a fully simulated aircraft.

Chroma-key-enhanced virtual scenery provides the pilot in training with a sensation of immersion like never before, including scenarios that require side-by-side visibility of other pilots or aircraft within the simulation.

“It’s not only a new way of working together more efficiently, but it also allows you to express yourself with body language and movement, just like you would in real life.”

Urho Konttori,
Co-founder & Chief Product Officer at Varjo

Perfect connection between people through virtual interaction

With marker tracking, you can share objects with other users inside the collaborative virtual reality spaces. You can also share the coordinate system with other users while working on tasks – the coordinate system stays the same for everyone in the scenery as markers are “glued” to certain points. 

To support the collaborative setup, users can set multiple markers in place to make the coordinate system more reliable. Marker tracking also enables professional users to scale up the coordinate space significantly, with millimeter-level accuracy and precision.
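One way to picture the shared coordinate system is that every participant expresses content relative to the marker’s pose rather than their own headset origin; because the marker is a single physical object, the content lands in the same physical spot for everyone. The snippet below sketches that composition with plain 4x4 matrices and is a conceptual illustration only, not Varjo SDK code:

```cpp
#include <array>

// Minimal column-major 4x4 transform; illustrative only.
using Mat4 = std::array<float, 16>;

// Multiplies two column-major 4x4 transforms (result = a * b).
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int col = 0; col < 4; ++col) {
        for (int row = 0; row < 4; ++row) {
            for (int k = 0; k < 4; ++k) {
                r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
            }
        }
    }
    return r;
}

// Each user observes the same physical marker in their own world coordinates,
// so content authored in "marker space" ends up at the same physical spot:
//   contentInUserWorld = markerPoseInUserWorld * contentInMarkerSpace
Mat4 placeSharedContent(const Mat4& markerPoseInUserWorld,
                        const Mat4& contentInMarkerSpace) {
    return multiply(markerPoseInUserWorld, contentInMarkerSpace);
}
```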

With chroma key and marker tracking, you can share a space together with other users and see each other physically at the same time, in real time. It’s not only a new way of working together more efficiently, but it also enables you to express yourself with body language and movement, just as you would in real life.

When collaborating on designs with others through chroma key, you are able to experience the real thing. Compared to the isolated nature of virtual reality, mixed reality with chroma key and visual markers offers a ground-breaking shared experience where you can interact naturally with your colleagues and clients on any project, reviewing and discussing in real time.

Below you can find answers to the questions that went unanswered during the webinar.

How to blend virtual and real – Answers to the Q&A

QUESTIONS ABOUT CHROMA KEY

 

Q. Can early access users use Chroma Key features through the APIs?

A. Chroma Key is part of Varjo’s Mixed Reality C++ API that was shared with all Varjo users as part of software release 2.2. The API provides functionality for managing chroma key configurations and for enabling or disabling the feature for specific client layers. ChromaKeyTool is an experimental helper tool for setting up chroma key configurations. It can be provided to selected early access partners. Unreal and Unity plugins and Varjo UI support will be added in the following releases. API documentation is available here.

Q. Do the camera images pass through the computer before they get to the display, or is all chroma-keying and warping taking place on the headset itself? 

A. Chroma keying happens in our video-see-through post-processing and compositor pipeline on PC. The path is the same regardless of whether the chroma key feature is enabled or not.

Q. Can you explain the “green flakes” seen in the Chroma Key demo?

A. As a technique, chroma keying requires a controlled environment and lighting. The “green flakes” seen in the live demo are due to the natural lighting in the room and the Varjo mixed reality cameras being set to automatic mode and trying to adapt to the varying conditions. This leads to some false metering when the camera sees only the green screen surface. An optimal setup should have dedicated lights to create an even light-scape, fine-tuned parameters for the environment, and camera settings locked to prevent the white balance from changing over time. We are working on improvements to our chroma key algorithm so that it automatically adjusts to the camera settings.

Q. Can I use my own code here for creative control of the image blending?

A. This is something we are currently working on – stay tuned for future updates.

Q. Will Varjo’s headsets offer a 360 video player that takes advantage of Chroma Key by compositing the viewer’s body into the live-action experience? If so, will it support stereo or only mono?

A. While Varjo doesn’t provide a 360 video player, it’s possible to create one using the C++ API or any of the supported engines, such as Unity or Unreal. Alternatively, one can use a video player from the Steam store.

Q. Is there a method for adjusting chroma settings (instead of random nudging)? 

A. Varjo will provide a user interface for adjusting the settings in the upcoming software updates. That will be based on Varjo’s experimental ChromaKeyTool utility that allows the user to pick base parameters from the camera image by pointing the headset towards the green screen. Parameters can then be fine-tuned manually.

Q. Can you explain the Chroma Key occlusion? Is there also occlusion for real hands and virtual objects?

A. Chroma key masking is based on color matching – it does not recognize real hands in any way. The occlusion works because human skin is a different color from the chroma backdrop, which is usually green or blue, as those colors are furthest from human skin tones.

Q. Are the real-world included objects projected in mono or stereo in the chroma keyed environment? I understand the cameras are stereo but what about the image?

A. Everything is projected in stereo. The high-resolution focus area has a higher resolution chroma key mask. If eye tracking is enabled and calibrated, that high-resolution area will also follow your gaze.

Q. What is the difference between Depth masking vs. Stencil masking vs. Chroma?

A. All of the above can be used to solve the same problem. Depth masking requires that VR content is positioned in the physical free space (within the room walls), and hands occlude VR content when they are closer to the HMD than the VR content. For hand occlusion, chroma keying usually provides higher resolution and more stable silhouettes. Stencil masking (using a model of the real environment in VR rendering where alpha is rendered as 0) allows the occlusion of static objects, e.g., a plane cockpit, but does not solve hand occlusion when hands are brought over VR content (such as scenery seen through cockpit windows). For more details, please have a look at our blog post discussing the masking issue.
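To make the comparison concrete: all three techniques ultimately boil down to a per-pixel decision of whether to show the video-see-through (VST) camera pixel or the rendered VR pixel. The sketch below is a simplified, headset-agnostic illustration of that decision, not actual compositor code:

```cpp
// Simplified per-pixel compositing decision; illustrative only.
struct Rgba {
    float r, g, b, a;
};

enum class MaskMode { Depth, Stencil, Chroma };

// vstDepth / vrDepth are distances from the headset, stencilAlpha is the
// alpha written by a stencil model of the real environment, and isChromaKeyed
// tells whether the camera pixel matched one of the configured key colors.
Rgba composite(MaskMode mode,
               Rgba vstPixel, float vstDepth,
               Rgba vrPixel, float vrDepth,
               float stencilAlpha, bool isChromaKeyed) {
    switch (mode) {
    case MaskMode::Depth:
        // Whichever surface is closer wins: real hands in front of VR content
        // show through the video pass-through.
        return (vstDepth < vrDepth) ? vstPixel : vrPixel;
    case MaskMode::Stencil:
        // A model of the real environment rendered with alpha = 0 punches
        // holes in the VR image where static real objects (e.g., a cockpit
        // frame) should remain visible.
        return (stencilAlpha == 0.0f) ? vstPixel : vrPixel;
    case MaskMode::Chroma:
        // Keyed (e.g., green) camera pixels are replaced by VR content;
        // everything else, such as hands over the green screen, stays real.
        return isChromaKeyed ? vrPixel : vstPixel;
    }
    return vrPixel;
}
```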

Q. What’s the max distance between HMD and Chroma?

A. There are no limitations on the distance between the HMD and the chroma surface. The only thing that matters in a setup is that the camera needs to see the color.

 

QUESTIONS ABOUT VISUAL MARKERS

 

Q. What’s the longest distance to track markers?

A. Below is a list of the different marker sizes and their recommended maximum distance from the headset while the marker is being tracked:

  • 25mm (~1”): best at distances up to 0.5 meters (1.5 feet)
  • 50mm (~2”): best at distances up to 1 meter (3 feet)
  • 150mm (~6”): best at distances up to 3 meters (9 feet)

Q. Can I use custom markers? For example, a giant 4-meter-high marker. Can you use different-sized markers at the same time?

A. Custom markers are not currently supported.

Q. Can you use the Vive pucks instead of print markers?

A. Vive pucks can be used through the OpenVR API.

Q. Will it be possible to use your hand to interact with a marker that has a virtual screen placed on top of it, or will the hand interfere with the marker?

A. The marker keeps its last known location when obstructed. The timeout of an inactive marker can be changed to a desired length of time using the native API.
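A minimal sketch of the “keep the last known pose until a timeout expires” behavior described above; the type names and the default timeout value here are illustrative and not the actual API:

```cpp
#include <chrono>

// Illustrative only: a marker keeps its last known pose until it has been
// out of view for longer than a configurable timeout.
struct Pose {
    float position[3];
    float orientation[4];  // quaternion (x, y, z, w)
};

struct TrackedMarker {
    Pose lastKnownPose{};
    std::chrono::steady_clock::time_point lastSeen{};
    bool hasPose = false;
};

// Hypothetical default; the real timeout is configured through the native API.
constexpr std::chrono::milliseconds kMarkerTimeout{500};

// Call once per frame with the observed pose, or nullptr if the marker was
// obstructed (for example, by a hand reaching towards the virtual screen).
void onMarkerFrame(TrackedMarker& marker, const Pose* observedPose) {
    const auto now = std::chrono::steady_clock::now();
    if (observedPose != nullptr) {
        marker.lastKnownPose = *observedPose;
        marker.lastSeen = now;
        marker.hasPose = true;
    } else if (marker.hasPose && now - marker.lastSeen > kMarkerTimeout) {
        marker.hasPose = false;  // treat the marker as inactive after the timeout
    }
}
```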

 

QUESTIONS ABOUT BOTH FEATURES

 

Q. Can this be integrated with OpenVR?

A. Chroma keying is supported only in the Varjo native C++ API at the moment. However, using our experimental ChromaKeyTool utility, we can force a global override mode that works in every application, regardless of whether it is an OpenVR, Unity, Unreal, or Varjo native app. It works even for a pure VR application without any mixed reality support. Visual Markers are accessible through the native C++ API only at the moment.

Q. Was there hand occlusion with your marker tracker combined with the green screen?

A. At the beginning of the Visual Marker demo (see timestamp 35:20 in the webinar recording), Varjo’s co-founder Urho Konttori brought a Pac-Man arcade machine against the green canvas. Hand occlusion was then disabled to show other marker-tracked objects in a non-green environment.

Q. Can you describe the interaction with the built-in eye-tracking?

A. The chroma key mask is evaluated at a higher resolution in the focus area. Our foveated video-see-through follows the user’s gaze, and if eye tracking is enabled, the user will always see the sharper mask where their gaze is pointed. This enhances the visual experience significantly.

Q. What is the cause of the slight “swimming” visible between virtual and real objects? Are the latencies of the virtual images and camera images equal?

A. Typically VST latency is lower than VR latency; however, we correct this by doing positional and rotational timewarp to align both views. The tracking system (e.g., SteamVR) may introduce drifting that is more apparent in MR scenes than in full VR scenes; however, one solution to this problem is to use Visual Markers, which, by definition, align VR positioning with the visual VST image.

 

QUESTIONS ABOUT VARJO MIXED REALITY

 

Q. If I have two Varjo headsets, I would love to feed the camera from one headset to the display of the other, and vice-versa. Does your system have this kind of flexibility?

A. While we do not have such a feature built-in, a client application can read the VST video stream and send it over the network to another client, who then renders this as VR content. It should be noted that in this case, the latency would be much higher.

Q. How is the latency from front-facing cameras to rendering? Is it dizzying if you move your head around? Is there some sort of re-projection?

A. We are achieving 15ms photon-to-photon latency, measured from the end of the exposure time of the VST cameras to photons coming out of the displays. From camera to rendering, the latency is lower, at roughly 9ms. This latency is low enough that most people cannot notice it at all; however, we further reduce the apparent latency by performing a render-time rotational timewarp to eliminate lag originating from turning your head.
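As a rough picture of what rotational timewarp does (a conceptual sketch, not Varjo’s implementation): at display time, the compositor computes the small rotation the head has made since the frame’s pose was sampled and re-projects the rendered image by that delta:

```cpp
// Quaternion utilities for the conceptual timewarp sketch; illustrative only.
struct Quat {
    float x, y, z, w;
};

Quat conjugate(const Quat& q) { return {-q.x, -q.y, -q.z, q.w}; }

// Hamilton product of two quaternions (a * b).
Quat multiply(const Quat& a, const Quat& b) {
    return {
        a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    };
}

// Rotational timewarp, conceptually: the correction applied at display time is
// the rotation from the head orientation used when the frame was rendered to
// the most recent head orientation. The compositor rotates the rendered image
// by this delta so the view matches where the head is pointing right now.
Quat timewarpDelta(const Quat& renderedHeadOrientation,
                   const Quat& latestHeadOrientation) {
    // For unit quaternions the conjugate equals the inverse.
    return multiply(latestHeadOrientation, conjugate(renderedHeadOrientation));
}
```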

Q. Are your cameras capturing the real world in stereo or mono?

A. The cameras are capturing the real world in stereo.

 

QUESTIONS ABOUT FUTURE OF THE PRODUCTS

 

Q. Do you have a wireless version coming in the future?

A. Like all product companies, we are always developing new things in our labs, but we do not comment on future products ahead of time.

Q. Are Varjo’s libraries stand-alone, ensuring that future applications will be compatible not only with the next Varjo generation but also with other headsets?

A. Varjo APIs have been made to be optimal for Varjo products. We do not comment on future developments, be they hardware- or software-related.

 

You can find the rest of the Q&A by watching the recording below.

 

Watch webinar

 

 

Get started with the new mixed reality features

With our latest software update, merging synthetic and analogue realities becomes easier than ever, making it almost impossible to tell where the real world ends and virtual reality begins.

Both chroma keying and marker tracking are now available in early access to all users of Varjo mixed reality. In our software release 2.2, chroma keying is available through Varjo’s native SDK. See our documentation and start blending real and virtual seamlessly.

Download the latest Varjo Base

See developer docs
