Beyond Foveation: Evolving the Role of Eye Tracking at Varjo
At Varjo, eye tracking has long played a critical role in unlocking advanced functionality in our XR headsets, from foveated rendering to automatic IPD adjustment. But as XR adoption expands across industries, eye tracking is fast becoming one of the most powerful tools for training, simulation, and assessment.
In this Insider post, we explore how eye tracking is evolving from a behind-the-scenes enabler to a key driver of immersive learning, human performance insight, and operational readiness.
Eye Tracking as a Core Enabler
Historically, eye tracking at Varjo has enabled two primary features:
- Foveated Rendering
A performance-boosting technique that renders in full resolution only the part of the scene the user is looking at directly, while the periphery is rendered at lower detail. This dramatically reduces GPU load while delivering top-tier visual fidelity.
- Automatic IPD Adjustment
By continuously tracking the user’s eyes, the headset can adjust interpupillary distance (IPD) in real time, ensuring each user sees the most accurate and comfortable view possible.
With the XR-4 series, eye tracking isn’t just a feature: it’s foundational. It powers essential headset capabilities such as video pass-through (VST) and auto-focus (in the XR-4 Focal Edition), and is required to deliver the high precision and realism users expect from Varjo.
Unlocking the Future of XR Training
Beyond enabling headset features, eye tracking is rapidly becoming a cornerstone of XR-based training and skills assessment. Here’s how:
- Objective Review of Attention and Focus
During a training session, eye tracking data can be recorded and analyzed to determine exactly where a trainee looked, and when. This lets instructors assess whether trainees followed the correct procedures, looked at critical information at the right moments, or overlooked key steps.
- Performance Benchmarking
Comparing expert vs. novice gaze patterns helps identify where trainees need to improve. Did the trainee look at the right indicator during a system failure? Did they check their surroundings during a safety drill? Eye tracking makes this insight measurable.
- Stress and Cognitive Load Assessment
Eye behavior offers rich clues about mental state. Metrics like pupil dilation, blink rate, and fixation duration can be used to infer stress, fatigue, or cognitive overload, providing instructors with data that’s otherwise invisible (a simple sketch of deriving such metrics appears at the end of this section).
- After-Action Reviews with Visual Playback
Trainees and instructors can review recordings with gaze overlays to see exactly where attention was focused throughout a session. This reinforces learning, sharpens awareness, and improves retention.
Industries like defense, aviation, manufacturing, and healthcare are already seeing the impact of integrating eye tracking into XR training pipelines.
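To make the metrics above concrete, here is one way blink rate and mean fixation duration might be derived offline from a recorded gaze stream. This is purely illustrative: the sample layout, the openness and velocity thresholds, and the simple velocity-based fixation test (I-VT) are assumptions for demonstration, not part of any Varjo API.

```cpp
// Illustrative only: deriving blink rate and mean fixation duration from a
// recorded stream of gaze samples. The sample layout and thresholds below are
// assumptions for demonstration, not part of any Varjo API.
#include <cmath>
#include <cstdio>
#include <vector>

struct GazeSample {
    double timeSec;   // timestamp in seconds (e.g. 200 Hz -> samples 5 ms apart)
    double dir[3];    // combined gaze direction as a unit vector
    double openness;  // 0.0 = eyes closed, 1.0 = fully open
};

// Angle in degrees between two unit gaze vectors.
static double AngleDeg(const double a[3], const double b[3]) {
    double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    dot = std::fmax(-1.0, std::fmin(1.0, dot));
    return std::acos(dot) * 180.0 / 3.14159265358979323846;
}

int main() {
    std::vector<GazeSample> samples;  // assume: loaded from a session recording
    if (samples.size() < 2) return 0;

    const double kClosedOpenness = 0.1;      // below this, treat the eye as closed
    const double kFixationVelocityDeg = 30;  // deg/s threshold for fixation (I-VT)

    int blinks = 0, fixations = 0;
    bool eyesClosed = false, inFixation = false;
    double fixationTime = 0.0, totalTime = 0.0;

    for (size_t i = 1; i < samples.size(); ++i) {
        const GazeSample& prev = samples[i - 1];
        const GazeSample& cur = samples[i];
        const double dt = cur.timeSec - prev.timeSec;
        if (dt <= 0.0) continue;
        totalTime += dt;

        // Blink detection: count each transition into the "closed" state.
        const bool closed = cur.openness < kClosedOpenness;
        if (closed && !eyesClosed) ++blinks;
        eyesClosed = closed;

        // Fixation detection: gaze velocity below a fixed threshold.
        const double velocityDegPerSec = AngleDeg(prev.dir, cur.dir) / dt;
        const bool fixating = !closed && velocityDegPerSec < kFixationVelocityDeg;
        if (fixating) fixationTime += dt;
        if (fixating && !inFixation) ++fixations;
        inFixation = fixating;
    }

    if (totalTime > 0.0)
        std::printf("Blink rate: %.1f blinks/min\n", blinks / (totalTime / 60.0));
    if (fixations > 0)
        std::printf("Mean fixation duration: %.0f ms\n", 1000.0 * fixationTime / fixations);
    return 0;
}
```

In practice, production analytics pipelines use more robust blink and fixation classifiers, but even simple thresholds like these make attention and workload trends visible across a training session.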
New in Varjo Base 4.10: Preview Eye Tracking Analytics Extension for OpenXR
As part of our ongoing commitment to enabling deeper insight and integration across XR applications, the latest Varjo Base 4.10 introduces a preview version of the XR_VARJO_eye_tracking_analytics extension for OpenXR.
This new extension allows developers to access high-frequency, frame-accurate eye tracking data in real time, opening the door to powerful analytics use cases such as:
- Tracking gaze patterns during mission-critical training tasks
- Logging physiological indicators like pupil size or eye openness to monitor stress and alertness
- Capturing data streams for detailed post-session debriefing or live performance analysis
With a sampling rate of 200 Hz, the extension provides access to combined and individual eye data including pupil diameter, gaze direction, eye openness, and more. This level of insight can be instrumental for advanced training systems, simulation platforms, and cognitive workload studies.
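For the post-session debriefing use case, such samples can be flattened into a simple log. The snippet below is a hypothetical CSV logger whose field names mirror the data types mentioned above; the extension’s actual structures are defined in the Varjo OpenXR SDK documentation.

```cpp
// Hypothetical CSV logger for 200 Hz eye tracking samples, intended for
// post-session debriefing. Field names are illustrative placeholders; the
// extension's actual data structures are defined in the Varjo OpenXR SDK.
#include <cstdint>
#include <fstream>

struct AnalyticsSample {
    int64_t timestampNs;         // sample time in nanoseconds
    float gazeDir[3];            // combined gaze direction (unit vector)
    float pupilDiameterLeftMm;   // left pupil diameter in millimetres
    float pupilDiameterRightMm;  // right pupil diameter in millimetres
    float opennessLeft;          // 0.0 = closed, 1.0 = fully open
    float opennessRight;
};

// Append one sample as a CSV row; called once per received sample.
void LogSample(std::ofstream& out, const AnalyticsSample& s) {
    out << s.timestampNs << ','
        << s.gazeDir[0] << ',' << s.gazeDir[1] << ',' << s.gazeDir[2] << ','
        << s.pupilDiameterLeftMm << ',' << s.pupilDiameterRightMm << ','
        << s.opennessLeft << ',' << s.opennessRight << '\n';
}

int main() {
    std::ofstream log("gaze_session.csv");
    log << "timestamp_ns,gaze_x,gaze_y,gaze_z,"
           "pupil_left_mm,pupil_right_mm,openness_left,openness_right\n";
    // In a real application, LogSample would be called for each 200 Hz sample
    // delivered by the analytics extension.
    return 0;
}
```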
Currently available as a developer preview, the extension can be enabled manually and is supported in the HelloXR sample included in the Varjo OpenXR SDK. We invite developers to experiment with this new capability and help shape its evolution through feedback and testing.
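On the application side, requesting a preview extension like this follows the standard OpenXR pattern. The sketch below uses only core OpenXR 1.0 calls (xrEnumerateInstanceExtensionProperties and xrCreateInstance) to check for and enable the extension; it does not show the analytics API itself, and the steps for enabling the preview in Varjo Base are covered in the SDK documentation.

```cpp
// Minimal sketch: checking whether the runtime exposes the preview extension
// and requesting it at XrInstance creation. Only core OpenXR 1.0 calls are
// used; the analytics data structures themselves are not shown here.
#include <openxr/openxr.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    // Ask the active OpenXR runtime which instance extensions it supports.
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);
    std::vector<XrExtensionProperties> props(count, {XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());

    const char* kAnalyticsExt = "XR_VARJO_eye_tracking_analytics";
    bool available = false;
    for (const XrExtensionProperties& p : props) {
        if (std::strcmp(p.extensionName, kAnalyticsExt) == 0) { available = true; break; }
    }
    if (!available) {
        std::printf("%s is not exposed by the runtime.\n", kAnalyticsExt);
        return 0;
    }

    // Request the extension when creating the instance.
    const char* enabledExtensions[] = { kAnalyticsExt };
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strcpy(createInfo.applicationInfo.applicationName, "EyeTrackingAnalyticsDemo");
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;
    createInfo.enabledExtensionCount = 1;
    createInfo.enabledExtensionNames = enabledExtensions;

    XrInstance instance = XR_NULL_HANDLE;
    if (xrCreateInstance(&createInfo, &instance) == XR_SUCCESS) {
        std::printf("Instance created with %s enabled.\n", kAnalyticsExt);
        xrDestroyInstance(instance);
    }
    return 0;
}
```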
N.B. As a preview feature, this extension may change in future Varjo Base updates. Instructions for enabling and testing it are available in the Varjo OpenXR SDK documentation.

Expanding Use Cases Across Varjo’s Ecosystem
In addition to training, several other eye tracking applications continue to grow in relevance:
- Natural Interaction
Gaze-based interaction is emerging as a natural complement to hands and controllers, allowing for seamless, intuitive UX design in enterprise applications.
- Social Presence
Visualizing eye movement in multi-user environments enhances realism and connection, making avatars more human and collaboration more engaging.
- Scientific Research & Biometrics
Eye tracking enables behavioral analysis and diagnostics in fields ranging from UX research to neuroscience. A notable example is our collaboration with machineMD, where eye metrics are used to support advanced medical assessments.
- Graphics Enhancement
Advanced visual features such as gaze-contingent depth cues, distortion correction, and chromatic aberration tuning all rely on real-time eye tracking for precision rendering.
The Vision for Eye Tracking at Varjo
Our continued investment in eye tracking is driven by a clear vision:
- Enable world-class foveated rendering and mixed reality (VST)
- Deliver effortless, accurate automatic IPD adjustment for every user
- Support natural eye-based interaction in enterprise XR workflows
- Provide actionable insight for training, diagnostics, and human performance analysis
Above all, our eye tracking systems are designed to work reliably across a diverse user base, accommodating differences in age, eye shape, skin tone, and eyewear.

Eye tracking is no longer just a behind-the-scenes feature. It’s becoming central to how we learn, train, and perform in XR. And we’re just getting started.
Let us know how you’re using eye tracking in your field, or what you’d like to unlock next.