“What we do is controlled chaos.”
Senior graphics developer since September 2017
Read our Senior Graphics Developer's thoughts about working at Varjo, or see all open VR jobs.
My background is in gaming. Before I came to Varjo I was writing a lot of 3D algorithms and system software, and I created the world’s first real-time raytracer.
I was a math prodigy when I was a kid. When my family got a computer in 1986 (a 286 EGA) I basically hoarded it and started coding.
First I coded games and then graphics. I got bored with BASIC in about six months and switched to Pascal, which had the most awesome inline assembler. After that I basically spent every night coding.
I have a long history with demoscening and I became pretty famous in the mid-90s for the real-time raytracing I was doing. Of course nowadays you can easily write all the same stuff with Shadertoy, but back then it was like doing the impossible.
At first coding was a hobby that I did outside my “boring” day job. But in 2004 I started coding mobile games at Mr. Goodliving, where I was the lead middleware engine coder. After that I co-founded the Helsinki-based game studio Grand Cru. We launched The Supernauts in 2014 and it got over a million downloads in the first week.
“Tech-minded people want to create awesome things that have never been done before. People like that are drawn to us. And we’re drawn to them.”
Our device is a super high-end, leading-edge immersive computing device, so the technical solutions we’re working on are really demanding.
I’m working on the VR compositor, which receives a standard linear-perspective image from an app and transforms it so that it displays correctly on a VR headset.
With “normal” VR headsets that would mean correcting distortion and reducing latency by re-projecting the image to the current orientation. But the Varjo headset is way more complicated, so achieving the desired illusion with the compositor is way more difficult. For example the colors have to be exactly right or else the edges of the screen will show.
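The orientation-only re-projection mentioned above (often called “timewarp”) can be sketched in a few lines: the app renders with the head pose sampled at render time, and the compositor re-samples each pixel along a view ray rotated by however much the head has turned since. This is an illustrative sketch, not Varjo’s actual compositor code; all function names here are made up for the example.

```python
import numpy as np

def rotation_y(angle):
    """Rotation matrix about the vertical axis (yaw), angle in radians."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def reproject_ray(pixel_ndc, render_rot, display_rot, focal=1.0):
    """Map a pixel's view ray at display time back into the rendered
    frame, returning the (u, v) coordinate to sample from it."""
    x, y = pixel_ndc
    ray = np.array([x, y, -focal])      # view ray at display time
    # delta rotation between the pose used for rendering and "now"
    delta = render_rot.T @ display_rot
    r = delta @ ray                     # same ray in render-camera space
    return (-focal * r[0] / r[2], -focal * r[1] / r[2])

# Head rotated 1 degree between render time and scan-out: the center
# pixel now samples slightly off-center in the rendered image.
u, v = reproject_ray((0.0, 0.0), rotation_y(0.0), rotation_y(np.radians(1.0)))
```

Doing this rotation at the last possible moment is what hides most of the render-to-display latency: the scene is a frame old, but its orientation on screen is not.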
Normally the human eye is really forgiving, but not when the point of comparison is right next to what you’re looking at.
“We move forward fast and when we make mistakes we correct them fast.”
There are also big challenges with latency, because the screens refresh at different speeds; the refresh rates differ because the screens are different sizes.
It’s really important to take the exact point at which the screen refreshes into consideration in VR, because if you don’t, straight lines are distorted as the user moves his or her head – kind of like what used to happen with old mobile phone cameras.
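The point about scan-out timing can be made concrete: when a display lights up line by line, each scanline corresponds to a slightly different moment, so the compositor can use a per-scanline head pose rather than one pose for the whole frame. This is a minimal illustrative sketch under that assumption (a linear scan from top to bottom and a constant head angular velocity); it is not Varjo’s implementation.

```python
def scanline_yaw(row, num_rows, yaw_start, yaw_end):
    """Linearly interpolate head yaw over the scan-out period.
    Row 0 is the first scanline lit, row num_rows - 1 the last."""
    t = row / (num_rows - 1)
    return (1.0 - t) * yaw_start + t * yaw_end

# A head turning at 100 deg/s on a 90 Hz display rotates about
# 100/90 ~= 1.1 degrees while one frame scans from top to bottom.
# Re-projecting each scanline with its own interpolated yaw keeps
# vertical lines straight; using one pose for the whole frame
# shears them, like the skewed photos from old phone cameras.
yaw_mid = scanline_yaw(539, 1080, 0.0, 100.0 / 90.0)
```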
To make all of these things work really well you need physics, but you also need a deep understanding of how vision works, and creative approaches to it. And then you have to figure out how to calibrate them all easily and accurately.
What’s particularly cool here is that I get to work with a device that does something that no other device in existence is capable of doing. So there’s a big sense of accomplishment when my code directly performs something unique that hasn’t existed before and I can see and feel its immersive effect right there.
Want to change the world?
Take a look at our VR jobs for developers or view all open positions.