
“Everything will change.”

“The stars are starting to align and we’re riding a mammoth tech wave right now.”

I’ve been coding since I was 6 and a professional programmer since I was 16. My background is in computer graphics and computer vision, and I still code every day.

I was a math prodigy when I was a kid. When my family got a computer in 1986 (a 286 EGA) I basically hoarded it and started coding.

Between 2006 and 2016 the industry was really boring. All we really saw was faster graphics and minor updates. And then all of a sudden things started to get interesting again with mixed reality and augmented reality.

Suddenly there was space for smaller companies and really smart people to innovate and really change how graphics are done. And that’s where we come in.

“We’re already close to perfection.”

I think it’s time that we change our rendering paradigms completely and get away from polygon rasterization, which has been the dominant method for the past 25 years. It’s finally time for real-time ray tracing, and the big players in the industry, like NVIDIA, are now stepping up and pushing that in 2018.
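To give a feel for what a ray-traced renderer evaluates per pixel – this is only a generic, minimal sketch in C++, not a description of any particular engine – the core operation is an intersection test between a camera ray and the scene geometry, with a sphere as the classic simplest case:

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Distance along the ray to the nearest visible hit, if any.
// origin/dir describe the camera ray; center/radius describe the sphere.
std::optional<float> intersectSphere(const Vec3& origin, const Vec3& dir,
                                     const Vec3& center, float radius) {
    const Vec3 oc = sub(origin, center);
    const float a = dot(dir, dir);
    const float b = 2.0f * dot(oc, dir);
    const float c = dot(oc, oc) - radius * radius;
    const float discriminant = b * b - 4.0f * a * c;
    if (discriminant < 0.0f) return std::nullopt;                       // the ray misses the sphere
    float t = (-b - std::sqrt(discriminant)) / (2.0f * a);
    if (t < 0.0f) t = (-b + std::sqrt(discriminant)) / (2.0f * a);      // camera is inside the sphere
    if (t < 0.0f) return std::nullopt;                                  // sphere is entirely behind the camera
    return t;
}
```

A real-time ray tracer runs millions of such tests per frame against far more complex geometry, which is why dedicated hardware support for it matters so much.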

It’s super interesting that at the exact same time that we’re building display technology that’s able to jump to human-eye resolution, there are parallel technologies being developed that we need but wouldn’t have the manpower to create ourselves. It’s like the stars are all suddenly aligning – the timing couldn’t be more perfect.

Human-eye resolution, which is our variant of foveated rendering, essentially sets an upper pixel-per-frame boundary. We’re already pretty close to 125 per second, so we know how many pixels we need to produce. We also have ray-tracing technologies that in a year or two will be able to pump that out. After that, we can just focus on perfecting perfection and making the images even more beautiful.
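To make the idea of an upper pixel-per-frame boundary concrete, here is a rough back-of-the-envelope sketch. All of the numbers in it – field of view, pixel densities, frame rate – are illustrative assumptions, not Varjo specifications: render only a small foveal window at full human-eye density and the periphery at a coarser density, and the total pixel count stays bounded no matter how sharp the display gets.

```cpp
#include <cstdio>

// Illustrative pixel-budget estimate for foveated rendering.
// Every number below is an assumption for the sketch, not a product spec.
int main() {
    const double foveal_ppd     = 60.0;   // "human-eye" pixels per degree in the fovea
    const double peripheral_ppd = 15.0;   // coarser density outside the gaze region
    const double foveal_fov     = 20.0;   // degrees covered at full density
    const double total_fov      = 100.0;  // total field of view, degrees (treated as square)
    const double frame_rate     = 90.0;   // frames per second

    // Foveal window rendered at full density.
    const double foveal_pixels =
        (foveal_fov * foveal_ppd) * (foveal_fov * foveal_ppd);

    // Periphery rendered at the coarse density, minus the area already
    // covered by the foveal window so it isn't counted twice.
    const double peripheral_pixels =
        (total_fov * peripheral_ppd) * (total_fov * peripheral_ppd) -
        (foveal_fov * peripheral_ppd) * (foveal_fov * peripheral_ppd);

    const double pixels_per_frame  = foveal_pixels + peripheral_pixels;
    const double pixels_per_second = pixels_per_frame * frame_rate;

    std::printf("pixels per frame:  %.1f M\n", pixels_per_frame / 1e6);
    std::printf("pixels per second: %.1f M\n", pixels_per_second / 1e6);
    return 0;
}
```

Under these assumptions the budget works out to a few megapixels per frame – far less than rendering the entire field of view at foveal density – which is the whole point of pinning the boundary to the eye rather than to the display.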

We’re already doing enormously interesting things like video see-through technologies. But we also need to be able to capture real-life 3D environments that people are actually in. Right now we’re at the beginning of that.

The first Varjo devices are professional devices that will be used in limited spaces. So we only need to capture small, cube-shaped environments. But fast forward just a few years and prosumers and consumers will want to be wearing these devices untethered and outside in the real world, so then the situation becomes a million times more complex.

When that happens, all the core hardware tech will change, but so will everything else. If you’re walking around the city, you’ll have a completely uncontrolled environment with tons more photons, so capturing that in real-time is going to be a huge challenge. Ultimately this is what we’re working towards and that’s beyond exciting.

“We build fast and we’re growing fast. That means you have to learn fast.”

I work mostly on gaze tracking and gaze prediction. The hardware we have is a foveated rendering architecture which operates on the idea that we do really fast, really accurate gaze tracking. So we know exactly where you’re looking and then we optimize the image based on this information.
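As a minimal sketch of how a gaze sample can drive foveated rendering – the types and viewport math below are made up for illustration and are not Varjo’s API – one simple approach is to project the gaze point into screen space and center a full-resolution inset there, rendering everything outside it at a reduced scale:

```cpp
#include <algorithm>

// A gaze point in normalized screen coordinates: (0,0) top-left, (1,1) bottom-right.
struct GazePoint { float x; float y; };

// A screen-space rectangle in pixels.
struct Rect { int x, y, width, height; };

// Center a fixed-size, full-resolution inset on the gaze point and clamp it to
// the display. Everything outside the inset would be rendered at a lower
// resolution and upsampled. Assumes the inset fits inside the display.
Rect fovealInset(const GazePoint& gaze, int displayWidth, int displayHeight,
                 int insetWidth, int insetHeight) {
    const int cx = static_cast<int>(gaze.x * displayWidth);
    const int cy = static_cast<int>(gaze.y * displayHeight);
    const int x = std::clamp(cx - insetWidth / 2, 0, displayWidth - insetWidth);
    const int y = std::clamp(cy - insetHeight / 2, 0, displayHeight - insetHeight);
    return {x, y, insetWidth, insetHeight};
}
```

The interesting engineering is in everything this sketch leaves out: the gaze data has to arrive with very low latency and very high accuracy, or the high-resolution inset ends up in the wrong place.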

But that also requires that we’re able to predict where you’re going to look next. This needs to be done super fast and very precisely. And that’s something that’s never been done properly.
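One very simple way to picture gaze prediction – purely an illustrative sketch, since a production predictor would use far more signal, for example to handle saccades – is to extrapolate the next gaze position from the two most recent samples under a constant-velocity assumption:

```cpp
// Constant-velocity extrapolation of the next gaze sample.
// Positions are in normalized screen coordinates; timestamps are in seconds.
struct GazeSample {
    float x, y;    // normalized screen position
    double time;   // timestamp in seconds
};

// Predict where the gaze will be at targetTime given the last two samples.
// Falls back to the latest sample if the timestamps are degenerate.
GazeSample predictGaze(const GazeSample& prev, const GazeSample& curr,
                       double targetTime) {
    const double dt = curr.time - prev.time;
    if (dt <= 0.0) return {curr.x, curr.y, targetTime};
    const double vx = (curr.x - prev.x) / dt;     // gaze velocity in x
    const double vy = (curr.y - prev.y) / dt;     // gaze velocity in y
    const double lead = targetTime - curr.time;   // how far ahead to predict
    return {static_cast<float>(curr.x + vx * lead),
            static_cast<float>(curr.y + vy * lead),
            targetTime};
}
```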

Of course it’s not just about gaze tracking. We also need to track your hands, the people around you, and basically everything. So if you want to come to work here there are tons of tracking problems to solve, all of which require distinct solutions.

“This isn’t a hardware company – this is a hardware and a software company.”

One of the most interesting things about working at Varjo is that this is not a hardware company – this is a hardware and a software company. We build both. And that means that for any problem we need to solve, we have two separate angles of attack. We can tackle it with existing, modified, or new hardware, software, or both.

For us, software and hardware development happens in parallel. And with 3D printing and other rapid prototyping technologies, we can iterate through multiple new versions of hardware every day.

I don’t personally have any real mechanical, optical, or engineering background, but I still work quite closely on developing the whole hardware side of the gaze-tracking solution.

Many of the programmers here have been working in gaming with engines like Unity® and Unreal®. We do most of our coding in C++ and C#, so those skills definitely have to be top-notch. Algorithm development and design talents are also highly emphasized, because so much of what we’re doing can’t be looked up – you have to develop it yourself.

Want to change the world?

Take a look at our open computer vision jobs, or see all developer and engineering positions.

Don’t see your dream computer vision job here?

If you’re passionate about tech and love to push limits, we want to know more. Send an open application to jobs@varjo.com and tell us who you are. You are also welcome to follow us on LinkedIn.
