Industrial-Strength Eye Tracking in Varjo's headset
One of the most important integrated features of Varjo’s headset is the 20/20 Eye Tracker, optimized for data analytics and user interaction in the highest-fidelity VR available.
What is eye tracking?
Eye tracking (or gaze tracking) is a technology for measuring the gaze direction of the user. When combined with positional head tracking, one can determine the exact point the user is looking at in the real or virtual world.
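Concretely, combining the two signals amounts to a ray intersection: the headset-local gaze direction is transformed into world space using the head pose, and the resulting ray is intersected with the scene. A minimal sketch for a planar surface follows; the function names and the plane representation are illustrative, not part of Varjo’s API:

```python
import numpy as np

def gaze_point_in_world(head_position, head_rotation,
                        gaze_dir_local, surface_point, surface_normal):
    """Intersect the world-space gaze ray with a plane (e.g. a wall or screen).

    head_position:  (3,) headset position in world coordinates
    head_rotation:  (3, 3) rotation matrix from headset to world frame
    gaze_dir_local: (3,) gaze direction in the headset's own frame
    Returns the 3D intersection point, or None if the ray misses the plane.
    """
    # Rotate the local gaze direction into the world frame.
    gaze_dir_world = head_rotation @ gaze_dir_local
    gaze_dir_world = gaze_dir_world / np.linalg.norm(gaze_dir_world)

    denom = gaze_dir_world @ surface_normal
    if abs(denom) < 1e-9:
        return None  # gaze ray is parallel to the surface
    t = ((surface_point - head_position) @ surface_normal) / denom
    if t < 0:
        return None  # surface is behind the user
    return head_position + t * gaze_dir_world
```

The same ray can of course be intersected with arbitrary scene geometry instead of a single plane.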
Eye tracking is already widely utilized in many fields of research, including marketing, psychology, medicine, usability, and user behavior. It also has many important use cases in VR/AR and is rapidly becoming a standard feature in high-end headsets. Examples include automatic inter-pupillary distance (IPD) adjustment, detecting when a headset is worn, and automated user identification through iris recognition. It can also be used as a control mechanism in user interfaces, and to improve the realism of social avatars.
As the human eye has reduced acuity in peripheral vision, eye tracking can be used to improve graphics performance by reducing the number of pixels rendered in areas the user cannot see clearly. This technique is known as foveated rendering, and Varjo headsets can exploit this phenomenon together with the human-eye resolution Bionic Display.
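The core idea can be illustrated by mapping angular distance from the gaze point to a coarser shading rate. The thresholds and region sizes below are illustrative placeholders, not Varjo’s actual foveation parameters:

```python
def shading_rate(eccentricity_deg):
    """Pick a shading rate from angular distance to the gaze point.

    A rate of N means one shaded sample per N x N pixel block, so larger
    rates render fewer pixels. The thresholds are illustrative only.
    """
    if eccentricity_deg < 5.0:
        return 1   # foveal region: full resolution
    elif eccentricity_deg < 15.0:
        return 2   # parafoveal region: one quarter of the pixels
    else:
        return 4   # periphery: one sixteenth of the pixels
```

A real implementation would evaluate this per tile or per draw region on the GPU, but the savings come from exactly this kind of eccentricity-dependent rate selection.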
Apart from detecting the Point of Interest, an eye tracking system provides information about fixations (whether a user is focusing on something), saccades and other eye motion patterns, blinking, and pupil dilation. Varjo provides a simple API that applications can use to query the relevant information about each eye at any given moment.
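As an example of what can be derived from such data, fixations are commonly detected from a stream of gaze samples with the classic dispersion-threshold (I-DT) algorithm. The sketch below is that textbook method, not a description of Varjo’s internal processing or API:

```python
def detect_fixations(samples, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    samples: list of (t_seconds, x_deg, y_deg) gaze samples in time order.
    A fixation is a window whose gaze spread stays under max_dispersion
    (degrees) for at least min_duration seconds.
    Returns a list of (start_time, end_time) fixation intervals.
    """
    fixations, window = [], []
    for s in samples:
        window.append(s)
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # The new sample broke the window; close the fixation (if any)
            # that ended at the previous sample, then restart.
            if window[-2][0] - window[0][0] >= min_duration:
                fixations.append((window[0][0], window[-2][0]))
            window = [s]
    if window and window[-1][0] - window[0][0] >= min_duration:
        fixations.append((window[0][0], window[-1][0]))
    return fixations
```

Saccades then fall out as the gaps between consecutive fixations.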
The integrated 20/20 Eye Tracker in Varjo HMD can be used for interacting with human-eye resolution VR content in various training scenarios.
How the 20/20 Eye Tracker Works
Video-based eye trackers, such as those used in Varjo HMDs, operate on a simple principle. High-speed video cameras record images of the eyes, while one or more illuminants produce “glints” that are reflected by the cornea. Using computer vision, the positions of the pupil and the glints are extracted. A mathematical model of the eye, combined with a brief user-specific calibration process, then allows computing the gaze direction from the pupil and glint data. The final accuracy and robustness of the solution are determined by the hardware (cameras, illuminators) and the software algorithms utilized.
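To illustrate the final step, classic video-oculography maps pupil–glint vectors to gaze angles with a polynomial regression fitted during calibration, while the user looks at a set of known targets. The sketch below shows that textbook approach; it is not Varjo’s actual eye model:

```python
import numpy as np

def _design_row(x, y):
    # Second-order polynomial basis in the pupil-glint vector components.
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_calibration(pupil_glint_vectors, gaze_targets):
    """Least-squares fit of a quadratic map from pupil-glint vectors
    (camera pixels) to known gaze targets (degrees), as in classic
    video-oculography calibration. Illustrative only."""
    A = np.array([_design_row(x, y) for x, y in pupil_glint_vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.array(gaze_targets), rcond=None)
    return coeffs  # shape (6, 2): one column per gaze angle component

def predict_gaze(coeffs, pupil_glint_vector):
    """Apply the fitted map to a new pupil-glint vector."""
    x, y = pupil_glint_vector
    return _design_row(x, y) @ coeffs
```

In practice at least six calibration targets are needed to constrain the six polynomial terms; a 3 x 3 grid is a common choice.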
In Varjo’s VR-2 and VR-2 Pro, two cameras inside the headset – one for each eye – capture images of the eyes at 100 frames per second. The sensors have a very high resolution (1280×800 pixels) compared to most eye tracking solutions. Varjo headsets do all of the recording in the infrared (IR) spectrum: as the human eye cannot see IR, and the headset is sealed, we can fully control the IR illumination inside it.
What makes Varjo’s 20/20 Eye Tracker highly accurate compared to other eye tracking solutions is its unique approach to glints. Instead of using traditional dot-shaped ones, our headset has a custom-made, complex IR illumination pattern. The main advantage of such complex illuminants is that the headset can recognize which reflection in the image corresponds to each illuminant by examining its orientation. This significantly increases the accuracy and robustness of the solution, especially at extreme angles and when users are wearing glasses. Data from both eyes is combined to produce the final 3D gaze vector estimate, making the 20/20 Eye Tracker the most accurate eye tracking ever integrated into a headset.
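The correspondence idea can be sketched as a nearest-orientation assignment: each detected reflection carries an estimated orientation angle, which is matched against the known orientations of the illuminant pattern. This toy example only illustrates the concept and is in no way Varjo’s actual algorithm:

```python
def match_glints(detected, illuminants):
    """Assign each detected reflection to the illuminant whose pattern
    orientation is closest, modulo 180 degrees (orientations are axial).

    detected:    {glint_id: orientation_deg} from image analysis
    illuminants: {illuminant_id: orientation_deg} known by design
    Returns {glint_id: illuminant_id}. Purely illustrative.
    """
    def ang_dist(a, b):
        d = abs(a - b) % 180.0
        return min(d, 180.0 - d)

    return {
        gid: min(illuminants, key=lambda iid: ang_dist(ang, illuminants[iid]))
        for gid, ang in detected.items()
    }
```

With dot-shaped glints no such discriminating feature exists, which is why correspondence is the hard part of traditional designs.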
“The most accurate eye tracking ever integrated into a VR device.”
Building the 20/20 Eye Tracker for Varjo
At Varjo, we’ve had the idea of building an industrial-strength, integrated eye tracking solution from day one, which is why eye tracking also influenced the physical headset design. Working together with our mechanics and optics teams, we built over a hundred hardware prototypes over the course of two years. 3D printing technologies were heavily utilized – even the lenses were 3D-printed samples – which allowed us to execute extremely rapid iteration cycles, often building multiple prototypes per day.
To perfect our eye tracking technology, we set up an extensive human testing laboratory, collecting terabytes of eye video data from hundreds of volunteers of different ages and genders, with a variety of skin and eye colors, makeup, and eye conditions. This data has been used to fine-tune both the hardware design and the algorithms. We have also built artificial robotic eyes which can produce eye patterns 24/7 – without getting tired – as well as eye simulators.
Extensive testing and iteration have enabled us to reach sub-degree accuracy with the 20/20 Eye Tracker, which is equivalent to pinpointing your fingernail at arm’s length. Combined with the headset’s unique visual fidelity, this allows users to interact with human-eye resolution VR content and to track and analyze eye movements in true-to-life VR. The tracker works just as precisely and accurately when worn with glasses, and the API can easily be used with existing solutions. We also offer plugins and example code for the Unity and Unreal engines.
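The fingernail comparison holds up to a quick back-of-the-envelope check, assuming an arm’s length of roughly 0.7 m:

```python
import math

# One degree of visual angle at arm's length (~0.7 m) spans
# 0.7 m * tan(1 deg), which is about 1.2 cm -- roughly the width
# of a fingernail. Sub-degree accuracy therefore resolves
# fingernail-sized targets at that distance.
span_cm = 0.7 * math.tan(math.radians(1.0)) * 100
```

So an error well under one degree keeps the estimated gaze point within a fingernail’s width of the true target.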
Eye tracking can be used for interacting with content: you can use it for selecting objects or prompting additional information about a specific object by simply looking at it.
The road ahead
With Varjo’s VR headsets and the 20/20 Eye Tracker now available, research and development on eye tracking continue at Varjo. In the future, we’ll continue optimizing the 20/20 Eye Tracker for the most demanding professional use cases. We aim to increase the sampling frequency of the 20/20 Eye Tracker, provide saccade landing point prediction, and enable more sophisticated analysis of the data, making the use of eye tracking even easier for application developers.
If you’re interested in hearing more about the 20/20 Eye Tracker or getting a demo, contact us.