Varjo Reality Cloud Launch Event: Your Questions Answered
We got so many questions during our Varjo Reality Cloud launch event that there wasn’t enough time to go through all of them during the live session. Below you can find answers to the remaining questions from Varjo’s co-founder and CTO Urho Konttori.
Get the full recording of the launch event here.
All the options mentioned are tethered. Does the cloud solution only work with those devices?
Right now it works only with Varjo headsets. As mentioned, we will be opening up support for more devices soon — both 2D devices like tablets and immersive devices like headsets.
Is support for WebXR and WebXR development tools available or planned? Also, do Varjo HMDs support WebXR browsers (which ones)?
We support Chrome and Edge. Please contact our Head of Product, Mat Zagaynov, to discuss your case in more detail.
I have a technical question regarding development methodology for Varjo Cloud…
Please contact Mat at email@example.com.
And if so, what would the OS requirements be to use it? I am currently running a MacBook Pro Retina with Boot Camp, so I have access to both macOS and Windows.
Currently you need a Windows laptop/desktop to connect the Varjo headset to. When more devices are opened, this will be changing.
Do you have any way of having “holographic” (eg volume captured) participation, rather than avatars?
We are very keen on this topic and you can learn more about our vision from the 2021 Varjo Reality Cloud vision video.
What are the client specs for the computer the Varjo headset is connected to?
Desktop or laptop with an RTX 2060 / 3060 or above. The key requirement is support for DP 1.4 with DSC to actually drive the displays in Varjo headsets. When more devices are added, this will be changing for those devices.
The headset seems to be capturing the people in the room. I assume the capture is stereoscopic. Can it be recorded in stereo? Can the room, with the model and the people, also be captured by a stereoscopic camera not on an HMD (not worn by a participant)?
Yes, your point of view, including the model can be recorded.
Hey, are there any plans for expanding the Aero line, and how long will the current Aero be supported?
We don’t comment on future devices, but Aero will have full support for years.
Anything relating to Varjo Aero?
You can absolutely run Varjo Reality Cloud with Varjo Aero.
Will there be a portable client, or a Varjo headset with a built-in cloud client?
Perhaps! We kind of hinted there might be. Some day.
How is the headset connected locally? Do you still need a high powered GPU on the local computer? The XR-3 for instance needs 2x DP connections?
Yes, XR-3 still requires 2 DP connections.
When could we have a standalone headset with 5G connectivity ?
In the future!
Does it support the 3D masking features?
You can use local masking tools with Varjo Lab Tools today and more features will be coming in the future. Please contact firstname.lastname@example.org about your use case.
Is it possible to share live 3D content, say from an Azure Kinect system, in a session?
If you use an app that provides that (e.g. Unity/Unreal).
Are there short-term plans for interactivity features, animation support?
Autodesk VRED supports animations already. Most of the visualisation apps do as well. Unity and Unreal are great for building interactive solutions.
Do you have any plans to eliminate the thin laptop and have a full untethered headset connected to 4G, 5G?
Thank you! Amazing. You may have mentioned this earlier, but I'm wondering if there's any info you can offer on a desktop/mobile experience to complement those in VR?
On desktop, you would be able to see the same content as on the headset, on a 2D screen. You could move around with, for example, WASD and mouse. Tablets can use ARKit to walk around as well. More details on these later.
What are the recommended specs for OTHER brands of VR headset, e.g., HP or Microsoft (just in case…)?
These will be revealed later, but they will be either the minimum or recommended specs of the other brands. We require a bit of specific video decode capability which might not exist in some older GPUs.
Question: How is it different from other pixel streaming services?
Human-eye resolution in a fully immersive environment, streaming at 90 FPS, and the full compute and service built around it to make it easy to use.
Why not using Microsoft Azure remote rendering ?
ARR is object rendering only, and optimized for looking at still, relatively simple polygonal models in augmented reality. It's also mainly an SDK, not a service. Its compute is also nowhere near AWS's elastic cloud.
Can you wear boots while using the headset?
Probably a stupid question, but how do you see other users in a multiuser environment (do you have avatars)?
VRED has avatars. Other services may have other means of showing people, but more on those later.
Are you planning to implement WebXR for Microsoft Edge?
It already works, but we don’t have the service to access it. Please contact our Head of Product email@example.com about your use case.
Most cloud gaming services have problems with video synchronization (between cloud image generation and local playback), resulting in sporadic frame skipping, stepping and/or image tearing. A workaround is often to increase buffering on the client side. How is this handled with your solution?
We have very strict time stamps, and we actually stream 3D voxels, so we can always do a relatively good late latch / time warp to keep the experience solid.
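To illustrate the late-latching / time-warp idea in the answer above, here is a minimal sketch. This is not Varjo's actual pipeline: the frame structure and the yaw-only correction are simplifying assumptions. The point is that the client always displays the newest frame that has arrived and corrects it with the freshest head-tracking data, instead of buffering frames to hide jitter.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp_ms: float    # server-side render time stamp
    render_yaw_deg: float  # head yaw the server rendered this frame for
    pixels: object         # frame payload (omitted in this sketch)

def late_latch(frames, latest_yaw_deg):
    """Pick the newest frame that has arrived and compute the time-warp
    correction needed to match the user's current head pose."""
    if not frames:
        return None, 0.0
    newest = max(frames, key=lambda f: f.timestamp_ms)
    # Rotational time warp: shift the image by the yaw delta accumulated
    # since the server rendered the frame.
    yaw_correction = latest_yaw_deg - newest.render_yaw_deg
    return newest, yaw_correction

# Two frames have arrived; the user's head has moved since the newest was rendered.
frames = [Frame(100.0, 10.0, None), Frame(111.0, 12.0, None)]
frame, correction = late_latch(frames, latest_yaw_deg=13.5)
print(frame.timestamp_ms, correction)  # prints: 111.0 1.5
```

In a real system the warp reprojects the full 6-DoF pose (and, with 3D voxel data, can reproject translation as well as rotation), but the principle is the same: latch the latest tracking data as late as possible before display.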
I understand CloudXR is an SDK, but why not use it as your foundation? What is technically superior in your core implementation?
Human-eye resolution and XR feature set.
I am currently looking for a laptop that allows me to develop and run an XR-3 experience. How soon can we purchase the new A10 card in a laptop? If that answer is now, where can it be purchased? Also, what configuration do you recommend, and which laptop brand?
Immediately through Varjo Reality Cloud! Seriously, I would not purchase an A10 for a laptop. Nvidia has the 3080 available for laptops, and it's meant for laptop use, whereas the A10 is mainly for cloud use. I've been using a Lenovo Legion 16” with a 3080 as my home laptop and can say it's outstanding. We've also used an Asus Zephyrus 15.6” with a 3080 for XR-3 demos a lot recently. Schenker and Orion have very powerful, but a bit heavier, laptops which also come with the 3080 and are very solid performers.
ETA for Quest2 support?
No dates to commit to yet.
Is eye-tracking available in the cloud?
This was answered during the webinar, but not clearly. Yes, eye tracking data is sent to the cloud, but it is not exposed to the client application through an API. Please contact firstname.lastname@example.org with your use case, and … it's trivial to do, so we'll prioritize it.
Hi, I have a hardware connection question: if I use a thin laptop to connect my XR-3 to the Varjo cloud, how do I connect the headset to the laptop? Currently the headset needs 2 DisplayPorts plus 2 USB-A ports. Does the laptop need to have this many ports?
Yes, many laptops do provide those ports, even for the XR-3. As mentioned above, we've been using an Asus Zephyrus 15.6” with a 3080 with the XR-3 a lot. Also, the laptop adapter makes life easier and cleaner with laptops.
Has there been any thoughts on making some sort of full body tracking for headsets?
We love that domain, but that's a bit of an unrelated question, so let's save it for a different time.
What do the avatars in the multi-user session look like?
You can adapt those, but the default looks a bit like a crash test dummy or EVE from WALL-E.
Knowing what technology we have access to these days, like the reality cloud and the immersive headsets from Varjo, could you predict if / when would we have them integrated as a workflow standard even in traditional industries?
Define traditional. But we see more and more traction in architecture and construction, mainly driven by the increasingly easy-to-use visualization tools. When we consider the immersive market expanding to around 100M units by the end of 2025, that's the first time I see it as realistic to talk about moving radically beyond the usual comfort zone of immersive-tech use cases. It's still early years.
I’m an independent consultant without a team to collaborate with at the moment, but I would like to demo the Aero to my clients. Is there a demo scenario I could log in to, in order to showcase this offering in real time?
Please contact our Head of Product email@example.com.
What is the average roundtrip latency in milliseconds (input to pixels) if you are near (say ~30 miles) from a physical cloud server location with a strong link? and is some form of spacewarping planned to minimise perceived latency? (and if so what specific technique does Varjo Cloud use)
Less than 1 ms for the data transfer if you have decent routers at your office. Light moves around 200 miles in a millisecond. Data transfer slows down at choke points, like the undersea cable links and the edges of operator networks. Wi-Fi is also a surprisingly big cause of lost milliseconds, with Wi-Fi 6 improving in other ways than just more bandwidth.
In addition to multiplayer, does it support co-location?
We can support that. Please contact firstname.lastname@example.org about your need. It requires a marker in the room for easy coordinate-system sharing.
Does the platform support two-way audio? For example, to allow live chat between XR users during collaboration sessions?
Would it be possible to keep the amazing feel of Varjo headsets while bringing down the size of the headsets?
In future, everything is possible.
Will the slides from the presentation be posted somewhere?
No, but the video recording is available here.
Is this different from Teleport, if yes, how?
Varjo Reality Cloud is about moving compute to the cloud, enabling new devices to get high-performance compute, and removing friction from the use of immersive applications. Teleport is about digitising the world around an endpoint in real time at human-eye resolution and transporting it to another place in the world, in real time, to be experienced as if you were in that other location with perfect sights and sounds. More on that in the future.
Have you considered incorporating external tracking sensors for physical props etc.?
We support multiple tracking methods like ART, OptiTrack, and Polhemus, as well as QR-marker-based tracking of objects. Contact email@example.com about your use case.
Will you be working to natively view CAD models from Dassault Systèmes CATIA and Siemens NX? These are two major CAD packages used in aerospace design.
You can pull from both of them into VRED directly, or through Siemens Teamcenter.
Other plugins can also be used through their OpenXR/OpenVR bridges, but … Contact firstname.lastname@example.org about your use case.
What does the Unity/Unreal timeline look like going forward, and, if I dare ask, Gen4?
Gen4: You can always ask, but we won’t comment.
How quickly can you swap assets while in a session?
It depends on many things, so we cannot give a stock answer. Pretty fast, typically. And you can pre-upload all the models before starting the session.
How about US SaaS cost?
I assume you mean the East Coast. Yes, there are servers in Northern Virginia.
Any exciting independent third party reviews or previews of the cloud tech coming in the near future? (Other than Thrill and Phea)
Plenty of reporters and analysts have seen it. Perhaps we'll make a collection of links at some point.
Are you working with operators like Orange? We will start our first XR testing on a 5G standalone network soon.
We haven’t yet worked with Orange. We would love to, though! Please contact Mat at email@example.com.