Varjo Lab Tools: Your questions answered
In December of 2021, John O’Neil (Solutions Architect, Varjo) and I (Samuli Jääskeläinen, Software Engineer, Varjo) hosted a webinar exploring the impressive real-time control that the Varjo Lab Tools application brings to mixed reality setups.
If you missed that webinar, I highly recommend giving it a watch: it includes a great overview and a live demo of the functionality on offer for Varjo users. You can watch it on-demand here.
At the end of the webinar, we hosted a live Q&A session. There were so many fantastic questions that we have compiled answers to some of those we couldn't answer live.
Additional questions from the Varjo Lab Tools webinar
What sort of overhead is there for running this on top of a VR application?
On average, the overhead is 1–5%. It may have a comparatively higher impact on VR applications with a low framerate than on applications running at 90 fps.
Can the mask be reversed? E.g. showing the majority of a virtual environment but bringing the physical objects inside?
Yes. For more information about the differences between the extended and restricted masking modes, check out the documentation here.
Can you demonstrate using the extended mask in Use mode while depth occlusion is on?
For a situation like this, I would recommend reaching out to support for help with this specific use case.
Video depth testing with near and far is a great feature. Any plans for including Lidar range testing against a model? For example, correct occlusions when grabbing a cockpit control?
No, this is not currently supported.
How are masks created in Varjo Lab Tools used in SteamVR, without Lab Tools?
Varjo Lab Tools works on top of SteamVR applications as a separate process, but if you want to use these masks in your application directly, you need to use our SDK.
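As a rough illustration of the SDK route, here is a minimal sketch based on the Varjo native SDK's mixed reality API. The function and flag names below come from SDK versions we are familiar with, and the render loop is elided, so treat this as a starting point rather than a complete implementation and verify against the current SDK documentation:

```cpp
// Minimal sketch, not a complete application. The application enables
// the pass-through video feed and then controls per-pixel mixing via
// the alpha channel of the layer it submits to the compositor.
#include <Varjo.h>
#include <Varjo_layers.h>
#include <Varjo_mr.h>

int main()
{
    varjo_Session* session = varjo_SessionInit();
    if (!session) return 1;

    // Ask the compositor to render the pass-through video feed
    // behind the application's content.
    varjo_MRSetVideoRender(session, varjo_True);

    // ... render loop: submit a varjo_LayerMultiProj with the
    // varjo_LayerFlag_BlendMode_AlphaBlend flag set. Pixels written
    // with alpha 0 show the video feed and pixels with alpha 1 show
    // the virtual content, which is effectively how a mask behaves
    // when applied from your own application. ...

    varjo_SessionShutDown(session);
    return 0;
}
```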
About the video depth test, is there a limitation of object count? How many layers can be recognized?
Depth testing has no object-count limitation, only a range limitation; I hope this answers your question. In the video depth testing settings, you can change how the depth is mixed with other applications.
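To make the range limitation concrete, here is a standalone sketch in plain C++ (an illustration of the concept, not Varjo SDK code) of the per-pixel decision a video depth test makes within a configured near/far range:

```cpp
// Conceptual per-pixel depth test: inside the configured range, the
// closer surface (real-world video vs. application content) wins.
struct DepthTestRange {
    float nearZ;  // start of the tested range, metres
    float farZ;   // end of the tested range, metres
};

// Returns true if the pass-through video pixel should be shown on
// top of the application's pixel.
bool showVideoPixel(float videoDepth, float appDepth, DepthTestRange range)
{
    // Outside the configured range the test is skipped and the
    // application's content always wins.
    if (videoDepth < range.nearZ || videoDepth > range.farZ) return false;

    // Inside the range, a closer real-world surface occludes the
    // virtual content. There is no per-object logic, which is why
    // there is no object-count limit.
    return videoDepth < appDepth;
}
```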
Can this lab tool be used for outdoor applications? What are the potential uses and challenges?
Optimal lighting conditions produce better results. It is technically possible to use it outside, but I wouldn't recommend it (especially if it's raining!).
When will this masking be added to the Unreal plugin?
You can run Varjo Lab Tools on top of an Unreal application, but for further information, please refer to our Unreal-related documentation.
We are having an issue with the pass-through cameras focusing clearly on digital screens. For example, viewing an approach plate on an iPad. Is there a setting in labs to adjust the focal point of the camera to resolve this?
No, the focal point is currently fixed, and this sounds like a slightly different issue, so you should check with support. There is also a sharpness tool in the Varjo Lab Tools application which may help.
Hi, do you have a forum as well in addition to documentation?
Our new Varjo Lab Blog section will soon feature a comment section, allowing for discussion around different aspects of mixed and virtual reality.
Will some of the tools be available for the Aero?
The Varjo Lab Tools application is currently focused on mixed reality, which requires pass-through cameras that the Aero does not have. However, VR functionality may be added to the tool in the future, which would also apply to the Aero.
Is it possible to invert the mask to render the virtual content outside of it?
Yes, the restricted and extended masking modes invert the mask to show either virtual content or pass-through footage.
In the case of the aircraft cockpit, how would the mask be made?
Following the instructions in the documentation, the cockpit can be mapped out by using a controller to indicate the placement of vertices, or by importing a pre-made mask.
Are there plans to include tracking methods different from ARuco Markers for masking capabilities?
We currently also support SteamVR tracking and the associated devices.
Can you talk about the eye-tracking tools you use? Is it your own solution or are you using something like Tobii? Why did you make that choice?
Our eye-tracking is actually really interesting, and I recommend you head to this article for a more in-depth response to your question.
Have you integrated Varjo headsets with Roblox? We have tried some simulations with Roblox and think it could be very useful for some scenarios.
Not yet. Roblox is a SteamVR-compatible game, so you can use it with Varjo Lab Tools. We are always interested in trying new things, so if you want to write about your experiences with Varjo headsets and Roblox, it could make for a very interesting Varjo Labs guest blog post. You can get in touch via lab@varjo.com if you're interested!
Do you plan to offer any academic licenses for Varjo Base usage?
Academic licenses should include the usage of Varjo Base and Varjo Lab Tools, but state your intentions when applying to make sure. You can apply here.
Can you tell us when multi-app support and chroma key are coming for Unreal?
Sorry, no answer to that yet.
Right now I look for the AprilTag on the furniture, and as soon as I find it I take the position and set the mask's position once; then I stop tracking the tag because the furniture doesn't move anymore. But the parallax is quite wrong and the mask moves too much when I move my head. Should I keep tracking the tag to fix that?
With a static marker the mask should stay in the right position. There is always a bit of minor swimming, but check that the mask has been set at the correct depth, as the small example below illustrates; you can do this by confirming that the mask lines up from multiple perspectives/positions around the object while creating it.
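To see why the depth matters more than continuous tracking here, this back-of-the-envelope sketch (plain C++ with made-up example numbers, not SDK code) estimates how far a mask anchored at the wrong depth appears to drift when the head moves sideways:

```cpp
// Parallax error from a mask anchored at the wrong depth. All the
// numbers below are hypothetical, chosen only for illustration.
#include <cmath>
#include <cstdio>

int main()
{
    const double objectDepth = 1.0;   // real furniture distance, metres
    const double maskDepth   = 0.8;   // mask accidentally anchored too close
    const double headShift   = 0.10;  // sideways head movement, metres

    // Apparent angular position of the object and of the mask after
    // the head moves sideways.
    const double objectAngle = std::atan2(headShift, objectDepth);
    const double maskAngle   = std::atan2(headShift, maskDepth);

    // The difference is the visible "swimming" of the mask.
    const double pi = 3.14159265358979323846;
    const double errorDeg = (maskAngle - objectAngle) * 180.0 / pi;
    std::printf("mask drifts by about %.2f degrees\n", errorDeg);  // ~1.4
    return 0;
}
```

With the depths matched, the two angles are identical and the drift disappears, which is why checking the mask from several positions while creating it helps more than re-reading a static tag every frame.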
Does your eye tracking tool use IR for tracking? Do you have pupil dilation tracking APIs?
Yes. You can find more info in the blog post, and here is some information on what type of data you can get from the eyes.
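As a rough sketch of what reading that data looks like through the native SDK's gaze API (the names below are taken from SDK versions we are familiar with; verify the exact fields against the current documentation):

```cpp
// Minimal gaze polling sketch using the Varjo native SDK.
#include <Varjo.h>
#include <cstdio>

int main()
{
    varjo_Session* session = varjo_SessionInit();
    if (!session) return 1;

    // Gaze tracking must be calibrated via Varjo Base beforehand.
    varjo_GazeInit(session);

    varjo_Gaze gaze = varjo_GetGaze(session);
    if (gaze.status == varjo_GazeStatus_Valid) {
        // Combined gaze ray in the headset's tracking coordinates.
        std::printf("gaze forward: %.3f %.3f %.3f\n",
            gaze.gaze.forward.x, gaze.gaze.forward.y, gaze.gaze.forward.z);

        // Relative pupil size estimates, usable for dilation studies.
        std::printf("pupil size L/R: %.3f %.3f\n",
            gaze.leftPupilSize, gaze.rightPupilSize);
    }

    varjo_SessionShutDown(session);
    return 0;
}
```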
For the masks that follow an object’s movement, does that require the use of a marker to track an object properly?
For moving objects you can use either the Varjo marker or a SteamVR-ready tracking device.
Is it possible to use other tracking sources as mask anchors with Varjo Lab Tools? E.g. ART tracking?
No, we do not currently support ART tracking in Varjo Lab Tools. You can probably anchor to the origin of the ART tracking space, but you cannot have moving objects.
Will this device work on a high-end laptop?
We highly recommend using a desktop; you can find the system requirements on the following page.
Would the mask be made to cover the outside of the cockpit?
You can do it either way; it's just a matter of preference.
Do you have an expected timeline for disabling the focus displays and freeing up space for an additional monitor on the graphics card?
In the latest version of Varjo Base, it is possible to disable the focus displays, but this does not yet free up a display port. Your feedback has been taken on board, though.
We developed a demo using chroma key, but we didn't know about the option to extend the virtual content beyond the chroma. How would it be done with Varjo Lab Tools? Just by selecting the extend option?
Yes: select extend and create a polygonal hull around the chroma area, or do it the other way around and simply create a restricted mask on the chroma.
Do you have plans for a mask marketplace?
No, but you are free to share the masks you create with others.
For a future update, could we get multi-step depth testing, for example if we want two different zones of depth detection ranges?
This is not currently planned but thanks for the feedback, we’re always taking new ideas into consideration.
Can you import higher resolution depth maps from external hardware tools like higher resolution lidar scanners to augment the capabilities of the headset?
Not currently.