

Eye tracking: Your questions answered

March 17, 2022
by Oleksii Shtankevych, Computational Vision Developer, Varjo

In March 2022, Varjo hosted an insight session covering the details and possibilities that integrated eye tracking brings to XR and VR technology, from behavioural analytics to entirely new ways of interacting with virtual content. The session was hosted by Varjo’s Lasse Tuominen and Ferhat Sen, and I joined to give input during the Q&A section at the end.

If you didn’t catch the live session, you can still watch the full recording. It includes demos, a walkthrough, and real-life examples of how Varjo’s eye tracking is being used in the field. I really recommend giving it a watch!

There were so many fantastic questions during the session that we were unable to cover them all live, so we have done our best to answer the majority of the remaining questions below. If we missed yours, or you have more questions, do not hesitate to get in touch.

We also have a Varjo Lab section in our Discord server where you can continue the discussion.


Here’s a sneak peek at one of the demonstrations (gaze tracking) shown in the full webinar

Calibration related questions


Is it possible to do the 1-dot calibration from within an external program?
Calibration done in Varjo Base will carry over into the use of other software.

Does “never calibrate” have any effect on frame rate or other performance indicators? Or does it only shorten the time from putting on the headset to seeing the VR content?
In “never calibrate” mode, eye tracking calibration is not auto-triggered and the foveation area remains locked to the center of the displays, i.e. it does not follow the user’s gaze direction. CPU load can be slightly lower in “never calibrate” mode.

Is there a calibration process that includes more than 5 dots? If so, do you have any data on how much it improves accuracy compared to 1 and 5 dot calibration?
Within Varjo Base there are only the 1-dot and 5-dot calibration options, but the API additionally offers the 10-dot “legacy” calibration. The 1-dot and 5-dot calibrations use observations from the current user together with statistical priors generated from diverse, previously collected and processed observations from other people. The legacy 10-dot calibration takes longer and uses only observations from the current user to train the eye-tracking algorithm. Legacy calibration can be appropriate, for example, in medical research use cases where the use of statistical priors may not be advisable. Although 1-dot calibration is enough for most scenarios, accuracy scales with the number of dots used: 1-dot is the least accurate, and 5- or 10-dot the most. For regular users to whom the statistical priors apply well, 5-dot is the most accurate calibration.
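
For developers who want to trigger this from code, here is a minimal sketch using the Varjo native API. The parameter strings (“GazeCalibrationType”, “Legacy”) follow the Varjo SDK documentation, but treat the exact names as assumptions and verify them against the headers of the SDK version you are using.

```cpp
// Minimal sketch: request the 10-dot "legacy" gaze calibration via the
// Varjo native API. Parameter strings follow the Varjo SDK docs; verify
// them against your SDK version.
#include <Varjo.h>
#include <cstdio>

int main() {
    varjo_Session* session = varjo_SessionInit();
    if (!session) {
        std::printf("Could not connect to Varjo Base.\n");
        return 1;
    }

    varjo_GazeInit(session);  // start eye tracking with default settings

    // Ask Varjo Base to run the legacy 10-dot calibration sequence;
    // the user will see the calibration dots inside the headset.
    varjo_GazeParameters params[] = {{"GazeCalibrationType", "Legacy"}};
    varjo_RequestGazeCalibrationWithParameters(session, params, 1);

    varjo_SessionShutDown(session);
    return 0;
}
```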

What is the accuracy difference between 5 dots and 1 dot calibration?
5-dot is more accurate than 1-dot, although 1-dot is enough for most typical scenarios. How much extra accuracy 5-dot calibration brings over 1-dot is user-specific. For example, 5-dot calibration can be a better choice for some eyeglass wearers with uncommon lens parameters.

Can we get the 10-dot calibration as an option in the Varjo Base?
It is currently only an option via the API, but your feedback has been taken on board!


Data related questions


Does the Varjo API expose raw eye-tracking data? Will sampling rates >>200Hz be supported in the future? That would open up the potential use of the data for medical/clinical and/or research purposes.
By default, the produced eye rays are smoothed, but it is possible to disable the smoothing filter via the API to obtain raw eye rays, e.g. for medical research. The sampling rate is limited by the frame rate of the eye tracking cameras currently used in the headset. We cannot yet comment on future Varjo headset models.
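
As a rough illustration of what disabling the filter looks like with the native API (the “OutputFilterType” parameter string is from the SDK documentation; confirm it against your SDK version):

```cpp
// Sketch: initialize gaze tracking with the smoothing filter disabled so the
// API delivers raw eye rays. Strings follow the Varjo SDK docs; verify locally.
#include <Varjo.h>

void initRawGaze(varjo_Session* session) {
    varjo_GazeParameters params[] = {
        {"OutputFilterType", "None"}  // "Standard" restores the default smoothing
    };
    varjo_GazeInitWithParameters(session, params, 1);
}
```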

Can I somehow enable the log info (shown on top of the slide) for a VR-3 in the Varjo Base software?
Yes, the VR-3 also has an integrated Varjo eye tracker, so you can use the Gaze Data Logging feature of Varjo Base with the VR-3 as well. Here you can find more information.

Hello from Cambridge, MA! According to the Varjo documentation: “The value for the size of the pupil between 0 and 1 calculated according to the pupil size range detected by the Varjo headset.” It doesn’t give you an objective measurement in mm. This has been our experience as well, and it’s been a problem for our research because it makes it difficult to publish and compare results. Are there plans to change this and provide a measurement in mm?
We answered this live, but I would like to add that starting from Varjo Base 3.5 and the related Varjo SDK, eye measurement estimates, including pupil and iris diameters, are provided in millimetres. More info here.
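
To illustrate (a sketch based on the Varjo SDK 3.5 documentation; the field names are assumptions to verify against your headers), the millimetre estimates arrive in an eye measurements struct alongside each gaze sample:

```cpp
// Sketch: poll one gaze sample together with eye measurements in millimetres
// (Varjo Base 3.5+). Names follow the Varjo SDK docs; verify locally.
#include <Varjo.h>
#include <cstdio>

void printEyeMeasurements(varjo_Session* session) {
    varjo_Gaze gaze{};
    varjo_EyeMeasurements measurements{};
    if (varjo_GetGazeData(session, &gaze, &measurements)) {
        std::printf("Pupils: %.2f / %.2f mm, irises: %.2f / %.2f mm, IPD: %.1f mm\n",
                    measurements.leftPupilDiameterInMM,
                    measurements.rightPupilDiameterInMM,
                    measurements.leftIrisDiameterInMM,
                    measurements.rightIrisDiameterInMM,
                    measurements.interPupillaryDistanceInMM);
    }
}
```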

Could you show processed data? What can we interpret from them?
I really recommend watching the latter half of the insight session, where we go through a good set of real-life examples.

Can the eye-gaze data be streamed using for example something like MQTT/MQ?
There is no built-in feature for streaming eye-gaze data over MQTT or a message queue. However, you can develop a simple application using the Varjo API to access the gaze data in real time and stream it according to your requirements.
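
For example, a user application could poll gaze samples through the native API and hand them to whatever transport you prefer. In the sketch below, publish() is a hypothetical stand-in for your MQTT client’s publish call; the Varjo calls follow the SDK documentation but should be verified against your SDK version.

```cpp
// Sketch: forward valid gaze samples to an external transport such as MQTT.
// publish() is a hypothetical placeholder for your MQTT client library.
#include <Varjo.h>
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>

void publish(const std::string& topic, const std::string& payload);  // your client

void streamGaze(varjo_Session* session) {
    varjo_Gaze samples[100];
    for (;;) {
        // Drain all gaze samples buffered since the previous poll.
        const int count = varjo_GetGazeArray(session, samples, 100);
        for (int i = 0; i < count; ++i) {
            const varjo_Gaze& g = samples[i];
            if (g.status != varjo_GazeStatus_Valid) continue;
            char json[256];
            std::snprintf(json, sizeof(json),
                          "{\"t\":%lld,\"dir\":[%.5f,%.5f,%.5f],\"focus\":%.3f}",
                          static_cast<long long>(g.captureTime),
                          g.gaze.forward.x, g.gaze.forward.y, g.gaze.forward.z,
                          g.focusDistance);
            publish("varjo/gaze", json);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(5));  // pace the loop
    }
}
```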

You say that the Unity SDK is more extensive than XR_EXT_eye_gaze_interaction (which would be used when integrating with Unreal) – can you highlight what you think are the key extra parameters available in Unity compared with Unreal?
Individual eye rays (left and right), pupil and iris diameters, and an IPD estimate, plus the ability to disable the smoothing filter to get raw output and to trigger calibration through the API.

Since OpenXR for Unreal seems more limited in data, how can we access the additional data within Unreal in an easy fashion? Could there be an updated version of the Varjo SDK for Unreal that will work with Unreal 4.27 or 5.0?
We are considering feasible options for extending data access in the Unreal and OpenXR integrations. One faster route might be to introduce a Varjo-specific custom OpenXR extension that exposes additional eye tracking data (faster because agreeing on the specification of a cross-vendor OpenXR extension with other vendors takes more time than a Varjo-specific one). This is still to be decided.

Are all these features supported in OpenXR specification?
There is currently only one cross-vendor eye tracking extension in the OpenXR specification – XR_EXT_eye_gaze_interaction. It only provides access to the combined gaze ray and focus position.

What kind of functionality can be accessed via the Unity plugin? Are we only able to access the data from the eye tracker or can we also change the frequency of eye tracking in real-time, enable/disable it, etc.? Also, what is the real-time performance cost of using/not using eye trackers inside the application?
Please check our documentation for the Unity XR SDK plugin. It is possible to control the data stream parameters, e.g. reduce the frequency of the eye tracking data stream sent to an external application from 200Hz to 100Hz via the API, but this will not affect CPU or GPU load in Varjo Base, since it does not change the eye tracking camera frame rate or the internal eye tracker processing. It is not possible to disable the eye tracker.
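
In the native API, the equivalent control is an initialization parameter, and the Unity plugin exposes a similar setting. A sketch (parameter strings from the SDK documentation; verify against your SDK version):

```cpp
// Sketch: request a 100Hz gaze data stream instead of the default 200Hz.
// This only thins the stream delivered to the application; the eye cameras
// and internal processing keep running at their native rate.
#include <Varjo.h>

void initGazeAt100Hz(varjo_Session* session) {
    varjo_GazeParameters params[] = {{"OutputFrequency", "100Hz"}};
    varjo_GazeInitWithParameters(session, params, 1);
}
```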

Does the Eye Tracking data contain the distance to focus point?
Yes, focusDistance returns the distance between the eye and the focus point, as a value between 0.0 and 2.0 meters.


To hear Ferhat talk through this guided walkthrough, check out the full recording of the insight session


Speciality cases related questions


Eye tracking and IPD with glasses / spectacles?
Glasses can reduce the accuracy of eye tracking data, but tracking still works, so we recommend that glasses users continue wearing them within Varjo HMDs. Thick or uncommon eyeglass frame shapes may affect or block the all-important internal LEDs a little more than thin frames.

What about diopter adjustment?
Varjo HMDs do not currently feature diopter adjustment, but they are designed to allow users to continue wearing glasses within the headset itself.

For someone who has an eye disease in one eye, so that the eye cannot focus, would that cause problems for the unit? Are the calculations an average of both eyes, or is one eye dominant?
Varjo uses a proprietary algorithm for combining the left and right eye tracker outputs into a combined gaze ray. It is much more advanced than simply computing the average of the two. However, irregular eye movement can still affect the combined gaze ray.

Is it possible to calibrate via Varjo Base even if GetGazeCalibrationQuality() returns ‘Invalid’ for one of the eyes?
An ‘Invalid’ status is currently returned when calibration has not been done or has failed. For now, there is a limitation that calibration can only succeed if both eyes were calibrated.

Hello from America! Will this work if your customers have asymmetric pupils or astigmatism?
Note that each eye is tracked separately. In practice, this helps mitigate the impact of some eye disorders, such as asymmetric pupils, on eye tracking. In the case of astigmatism, users are welcome to wear their eyeglasses with the headset.

How does it work for people who need corrective lenses in their goggles?
Users are welcome to wear their eyeglasses with the headset. The Varjo eye tracking algorithm supports it.


Eye tracking in Applications/Use Cases


Which app can be used to extract eye tracking data?
Eye tracking data can be exported by Varjo Base (see the gaze data logging tool) or by a user-written application with the help of the Varjo public API (for more information, check out our developer portal).

Never worked with Varjo before – just wondering which tools are available to synchronize the device with others (for research purposes, and very accurately, <1ms)?
Eye tracking data is supplied with accurate timestamps in nanoseconds (more info here). Synchronizing Varjo timestamps from the PC clock with timestamps from other sources and devices can be done in a user application.
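
One possible approach, sketched below: estimate the offset between the Varjo clock and the system clock, then re-stamp gaze samples onto the common timeline. varjo_GetCurrentTime is in the SDK documentation; the offset-estimation logic itself is our own illustration, and for sub-millisecond accuracy you would want to repeat and average the measurement.

```cpp
// Sketch: map Varjo capture timestamps onto the system clock so that gaze
// samples can be aligned with data from other devices. Illustrative only;
// repeat and average the offset measurement for sub-millisecond accuracy.
#include <Varjo.h>
#include <chrono>
#include <cstdint>

int64_t varjoToSystemOffsetNs(varjo_Session* session) {
    // Read both clocks back to back; the gap between the reads bounds the error.
    const int64_t varjoNs = varjo_GetCurrentTime(session);
    const int64_t systemNs = std::chrono::duration_cast<std::chrono::nanoseconds>(
        std::chrono::system_clock::now().time_since_epoch()).count();
    return systemNs - varjoNs;
}

// Usage: int64_t systemTimeNs = gaze.captureTime + varjoToSystemOffsetNs(session);
```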

Is eye tracking possible with DCS World?
Eye tracking is done via Varjo Base, which runs in the background while you use other applications, so it is possible regardless of what software you are running.

I’m told that Major League Baseball uses eye-tracking for marketing purposes, to give advertisers the best placement at the stadium for ads. Do you know how this is done? Do the fans know they’re being recorded?
It is likely done by providing a select group of the audience with eye-tracking-enabled glasses, so they would be fully aware that the data is being recorded.

Do you offer a special discount for University research lab buying Varjo headsets?
Yes, absolutely – you can find more information about academic pricing for Varjo here.


Other Questions


What does Varjo mean?
Varjo is a Finnish word, translating directly as ‘shade’, but we like to think that it is now synonymous with the best in XR and VR technology 🙂

Does eye tracking have the same accuracy independent of eye color/iris color?
Eye color affects the appearance of the iris in the IR eye camera image and thus the contrast of the edge between the pupil and the iris. Accurately tracking a pupil on top of a darker iris can be more challenging; however, the Varjo eye tracking algorithm was developed and tested to perform well with different eye colors.

Does the Varjo Aero have the same eye tracking technology as your other headsets?
All Varjo headsets include internal eye cameras for eye tracking.

Hi from Germany – is this included in all Varjo Headsets?
Yes! Eye tracking is available across our entire current hardware lineup (XR-3, VR-3 and Aero).

Are all these features, like the saved calibration option, available across all Varjo devices – the Varjo VR-2 specifically?
Foveated rendering and the foveated rendering calibration options are not available on older Varjo headsets (VR-2, VR-1, XR-1).

What happens to the usage after the Varjo VR-3 or XR-3 subscription period is over?
The Aero has no subscription requirement, and to get more information about the enterprise subscription for our other headsets, I would recommend checking out this page and reaching out to the team there if you have any more questions.

So it measures pupil dilation as well as direction, right?
The primary function of eye tracking is to provide information on where the user is looking in the VR/XR scene (direction and focus position). Pupil dilation is provided by Varjo as additional measurement data for research and social VR use cases.

What is the current latency of the eye-tracking and how does this limit what the capabilities are of the system? How will this change in the future?
Current latency can be measured yourself: the timestamps are included in the exported data, so you have everything you need to measure the latency on your own system. On a typical compliant PC, it usually takes around 12 to 15 milliseconds from the start of image capture in the eye camera to receiving eye tracking data through the native API in your application.
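
A back-of-the-envelope way to check this on your own machine (a sketch; per the SDK documentation, varjo_GetCurrentTime and the gaze capture timestamp are both on the Varjo clock, but verify against your SDK version):

```cpp
// Sketch: estimate eye tracking latency as the age of a gaze sample at the
// moment the application receives it.
#include <Varjo.h>
#include <cstdint>
#include <cstdio>

void printGazeLatency(varjo_Session* session) {
    const varjo_Gaze gaze = varjo_GetGaze(session);
    if (gaze.status == varjo_GazeStatus_Valid) {
        const int64_t nowNs = varjo_GetCurrentTime(session);
        const double latencyMs = (nowNs - gaze.captureTime) / 1e6;
        std::printf("Eye camera capture -> application: %.1f ms\n", latencyMs);
    }
}
```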

Am I dreaming, or did he just say that the auto-IPD allows you to see clearly objects that are very close?
Correct – IPD adjustment matched to the user’s eyes helps achieve better visual clarity in the headset at any distance.

How does the eye tracking algorithm work? Is it a neural network? what methods were used to train it?
Sorry, I can’t go into detail here, as the Varjo IP is under NDA.

For research purposes, it is important to know the details of certain parts of the eye tracking algorithms used, for example the basic algorithm that classifies eye movements into fixations, saccades and other features. Is this information available to researchers?
Information on the implementation details of the algorithm is not publicly available at this time.

The new pupil tracking API doesn’t appear to currently work properly with the experimental distortion correction. What progress has Varjo made on addressing this?
Support for accurate eye tracking with the improved experimental visual distortion profile (introduced in Varjo Base 3.5) is currently being worked on.

Is the combined gaze ray from the point between the eyes?
Correct – the origin of the combined gaze ray is the point midway between the eyes.

Are there different gaze filters available to determine whether a gaze event is a saccade or a fixation, and is it possible to create your own filters?
We provide stability measurements (read more here). Users are welcome to develop their own filters by analyzing and processing the eye ray information.
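
As an example of rolling your own, a classic dispersion-threshold (I-DT) classifier can be run directly on the stream of gaze directions. This is a generic textbook algorithm, not a Varjo SDK feature, and the thresholds below are illustrative only.

```cpp
// Sketch: a minimal dispersion-threshold (I-DT) fixation detector over gaze
// directions. Generic textbook algorithm, not part of the Varjo SDK; tune
// the thresholds for your own use case.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

struct GazeSample {
    double x, y, z;   // unit gaze direction
    int64_t timeNs;   // capture timestamp in nanoseconds
};

// Angular spread of a window, approximated as the largest angle between the
// first sample and any later sample in the window.
double dispersionRad(const std::vector<GazeSample>& w) {
    double maxAngle = 0.0;
    for (size_t i = 1; i < w.size(); ++i) {
        const double dot = w[0].x * w[i].x + w[0].y * w[i].y + w[0].z * w[i].z;
        maxAngle = std::max(maxAngle, std::acos(std::clamp(dot, -1.0, 1.0)));
    }
    return maxAngle;
}

void detectFixations(const std::vector<GazeSample>& samples) {
    const double maxDispersion = 1.0 * 3.14159265358979 / 180.0;  // ~1 degree
    const int64_t minDurationNs = 100'000'000;                    // 100 ms
    size_t start = 0;
    while (start < samples.size()) {
        // Grow the window while the samples stay within the dispersion limit.
        std::vector<GazeSample> window = {samples[start]};
        size_t end = start;
        while (end + 1 < samples.size()) {
            window.push_back(samples[end + 1]);
            if (dispersionRad(window) > maxDispersion) { window.pop_back(); break; }
            ++end;
        }
        const int64_t duration = samples[end].timeNs - samples[start].timeNs;
        if (duration >= minDurationNs) {
            std::printf("Fixation lasting %.0f ms\n", duration / 1e6);
            start = end + 1;  // consume the fixation window
        } else {
            ++start;  // too short or too spread out: likely saccade samples
        }
    }
}
```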

What’s the max refresh rate for the eye capture?
200Hz

Are there any considerable performance differences between the mentioned APIs?
Performance/latency differences between different eye tracking APIs are marginal.

We’ve found a fair amount of accuracy variability across people. Is there a way to improve eye gaze accuracy when used as an interaction modality with elements in VR?
For best accuracy, we suggest using the 5-dot calibration and paying attention to headset alignment. The headset must not be rotated on the head – eye alignment can be checked with the eye camera preview in Varjo Base. Eye relief (the distance from the eye to the headset lens) can also affect accuracy: do not tighten the headset so much that eye relief falls below 12mm. The best eye tracking performance is achieved with an eye relief of around 15mm. Here is an article with more information on how to achieve optimal headset placement.

I assume the Varjo marketing term “Human Eye Resolution” for the XR-3 only applies to the virtual focus display, but NOT to the Mixed Reality Camera resolution. Correct? What is the resolution of the XR-3 camera image?
The XR-3 has 12-megapixel, low latency video passthrough.

Do developers have access to the eye cam?
We have this API feature request in our backlog. We will discuss and decide whether to implement it, and we will count your question as a vote for it.

When we have multiple clients using the XR-3, how should we handle hygiene? What do you recommend?
Here is some helpful advice on how best to handle hygiene with Varjo headsets.


Looking ahead – questions about Varjo’s future plans


Does Varjo plan to track eye openness as part of its eye-tracking? If so, when does Varjo plan to implement it?
Eye openness is in our backlog – stay tuned, it will appear!

When will we see all of these extra pieces of data accessible within Unreal?
We are considering feasible options for extending data access in the Unreal and OpenXR integrations. One faster route might be to introduce a Varjo-specific custom OpenXR extension that exposes additional eye tracking data (faster because agreeing on the specification of a cross-vendor OpenXR extension with other vendors takes more time than a Varjo-specific one). This is still to be decided.

Any plans on a standalone Varjo headset?
I recommend following Varjo’s social media channels to be kept up to date on any new product information or announcements. 

Have you guys considered any 5G use cases? Can these devices be USB tethered to a 5G phone for internet connection?
To push the boundaries of XR and VR, there is testing going on with all kinds of technology at Varjo. No stone is left unturned in the quest to drive the cutting edge of immersive computing.

Will this tech go into all Varjo tech/form factors?
Eye tracking has a lot of current applications, from foveated rendering to behavioural analysis, and it will form an integral part of future functionality, so I imagine it will be included as standard in all future HMDs.

What can be expected long term with future Varjo HMDs?
Really incredible things. I wish I could say more, but I will simply say: stay tuned.

Could you share a sneak-peek into the roadmap for future updates and features that will be or are planning to be integrated?
I can recommend checking out the other articles in this Varjo Lab Blog, as they give a good inside look at the work going on behind the scenes at the Varjo office.

