Getting Started with Varjo SDK

If your project relies on your own engine that does not yet support the Varjo headset, you can add support for your engine using the Varjo SDK for Custom Engines. The Varjo SDK documentation is intended to help you prepare your project to work on the Varjo headset. Jump to the corresponding section: 

Going from another VR headset to Varjo headset  

Tracking

Migrating from OpenVR

Rendering to Varjo headset

Step by Step

Error handling 

Threading

Swap Chains

Layers

Timing

Events

Properties 

Timewarp 

Eye tracking 

 

GOING FROM ANOTHER VR HEADSET TO VARJO HEADSET 

When thinking about adapting your product for the Varjo headset, it is important to understand the difference between a “traditional” VR headset with traditional displays and the Varjo headset with the Bionic Display. You will need to use a much higher image resolution in order to use the Bionic Display to its full potential. Additionally, you will need to render two images per eye, instead of a single image, since each Bionic Display consists of two overlapping displays. Details on how to render images are explained in the section Rendering to Varjo headset.

 

TRACKING 

Currently, the Varjo headset uses SteamVR™ Tracking technology. The API used for SteamVR Tracking is called OpenVR. 

The OpenVR API can be utilized for controllers and trackers. You will need to initialize the SteamVR system as a background application to access the controllers in your application. An example implementation of hand controller tracking can be found in the Benchmark example. In most cases it is sufficient to get your existing controller implementation working by just changing the application type of your vr::IVRSystem 

from vr::EVRApplicationType::VRApplication_Scene 

to vr::EVRApplicationType::VRApplication_Background.

Just remember that this IVRSystem should be used only for controllers and input, not for rendering or head pose. 
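
For example, a minimal initialization sketch (assuming the OpenVR headers are available; error handling is left to your application):

#include <openvr.h>

vr::EVRInitError initError = vr::VRInitError_None;
// Background application type: OpenVR is used only for controllers and input,
// while rendering and head pose come from the Varjo API.
vr::IVRSystem* vrSystem = vr::VR_Init(&initError, vr::VRApplication_Background);
if (initError != vr::VRInitError_None) {
	// SteamVR is not running or could not be reached; handle the failure here.
}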

Be careful not to use OpenVR’s headset position to render the image. This may cause problems with the Varjo compositor and result in a suboptimal VR experience. You should only use the controller tracking and controller input portions of the OpenVR API. You should always render the image using the view information provided by the Varjo API. 

 

MIGRATING FROM OPENVR 

OpenVR is an SDK and API developed by Valve. If you have read its SDK documentation, you are most likely familiar with how it works, and you may have already ported your software to work with it. The Varjo headset does not use OpenVR for headset tracking. You must still use the OpenVR API for controller integration in your engine, but all headset-related functionality needs to be addressed through the Varjo API.  

 

RENDERING TO VARJO HEADSET 

The Varjo headset needs a session to work. With the session, you can query information to set up your render targets and buffers.

For the main loop, Varjo API gives you a sync point that you can use to initiate your render loop. Based on previous frames, the sync point gives you the latest moment you should start rendering to hit the vertical sync of the displays with the freshest data. The purpose of this is to ensure you always render with the latest positional data, keeping the tracking latency as low as possible. High latency can easily cause nausea and break the immersion.

Use varjo_GetViewCount to query how many viewports need to be rendered. If you get 4, it means there are two viewports for the background displays and two for the focus displays.

To query information (width, height) about each viewport, use varjo_GetViewDescription. The projection matrix has to be created with varjo_GetAlignedView.
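
As a rough sketch, sizing a single atlas texture that holds all views side by side could look like this (assuming a valid session pointer; check the varjo_ViewDescription field names against Varjo_types.h):

int32_t viewCount = varjo_GetViewCount(session);
int32_t atlasWidth = 0;
int32_t atlasHeight = 0;
for (int32_t i = 0; i < viewCount; ++i) {
	varjo_ViewDescription view = varjo_GetViewDescription(session, i);
	atlasWidth += view.width;            // views packed horizontally
	if (view.height > atlasHeight) {     // tallest view defines the atlas height
		atlasHeight = view.height;
	}
}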

Once the viewports have been rendered, you should submit the render targets to the Varjo system. Now the Varjo compositor can draw the images to the headset, applying distortion and color correction as well as time warp to the image. 

Varjo Base has a simulator that you can use to display the compositor output as a desktop window. Refer to the Developing Without Varjo headset page to learn how to enable it. Do note that nothing is rendered to the headset if the compositor window is enabled in software version 1.1. The version 1.2 update will allow the possibility to mirror the image from the Varjo headset. 

It is advisable to begin development with the compositor window when you start integrating the Varjo SDK for Custom Engines into your engine. Once you get the viewport images on the compositor window, you can verify the functionality in the real headset. Using the compositor window can speed up development, as you don’t need to constantly put on and take off the headset. 

 

STEP BY STEP 

This section explains how to show an image on the Varjo headset using DirectX or OpenGL. Developing DirectX and OpenGL support is very similar; the flow of the development process is described below, followed by a minimal code sketch.

  1. Initialize the Varjo system with varjo_SessionInit
  2. Query how many viewports need to be rendered with varjo_GetViewCount and set up the viewports using the info returned by varjo_GetViewDescription
  3. Create as many swap chains as needed via varjo_D3D11CreateSwapChain or varjo_GLCreateSwapChain. If you are planning to use one texture which will contain all viewports, one swap chain is enough. If you need a separate texture for each viewport, then create as many swap chains as there are viewports (varjo_GetViewCount)
  4. Enumerate the swap chain textures with varjo_GetSwapChainImage and create render targets
  5. Create frame info for per-frame data with varjo_CreateFrameInfo
  6. In your main loop
    1. Process your frame logic and after that, call varjo_WaitSync. This will fill in the varjo_FrameInfo structure with the latest pose data. Then either
      a. create a projection matrix with the help of varjo_GetAlignedView for each viewport (can be reused in following frames), or
      b. use the projection matrix provided in varjo_FrameInfo and modify the clipping planes to suit your needs
    2. Begin rendering the frame by calling varjo_BeginFrameWithLayers
    3. For each viewport
      1. Render the frame
    4. Submit the textures with varjo_EndFrameWithLayers. This tells the Varjo runtime that it can now draw the submitted frame buffer to the headset
  7. Free the allocated varjo_FrameInfo structure
  8. Shut down the session by calling varjo_SessionShutDown
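
A minimal sketch of the call order above (rendering, swap chain creation, and layer submission are omitted here; see the Swap Chains and Layers sections, and verify the exact signatures against Varjo.h):

#include <Varjo.h>

void runVarjoApp()
{
	// 1. Initialize the Varjo session.
	varjo_Session* session = varjo_SessionInit();

	// 2. Query the viewports and set up render targets sized accordingly.
	int32_t viewCount = varjo_GetViewCount(session);
	// ... call varjo_GetViewDescription(session, i) for each view and create textures ...

	// 3.-4. Create swap chain(s) and enumerate their textures
	// (varjo_D3D11CreateSwapChain / varjo_GLCreateSwapChain, varjo_GetSwapChainImage).

	// 5. Allocate the per-frame info structure once.
	varjo_FrameInfo* frameInfo = varjo_CreateFrameInfo(session);

	// 6. Main loop.
	bool running = true;
	while (running) {
		// 6.1 Run your frame logic, then block until the render sync point;
		// this fills frameInfo with the latest pose data.
		varjo_WaitSync(session, frameInfo);

		// 6.2 Begin the frame.
		varjo_BeginFrameWithLayers(session);

		// 6.3 Render each viewport using the view and projection data for this frame.
		for (int32_t i = 0; i < viewCount; ++i) {
			// ... render view i into the acquired swap chain image ...
		}

		// 6.4 Submit the rendered textures (layer setup omitted here).
		// varjo_EndFrameWithLayers(session, ...);
	}

	// 7.-8. Clean up. varjo_FreeFrameInfo is assumed as the counterpart of varjo_CreateFrameInfo.
	varjo_FreeFrameInfo(frameInfo);
	varjo_SessionShutDown(session);
}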

You can see an example implementation on the Examples page.

 

ERROR HANDLING 

You can query Varjo errors with the varjo_GetError function. The error code refers to the first error that occurred during the frame loop. Subsequent API calls may fail as a cascading result, without overriding the first error. Calling varjo_GetError also clears the cascaded errors. Errors are important and must be checked at least once every frame, and they can be used to inform the user about problems.  
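
For example, a per-frame check could look like this (varjo_GetErrorDesc is an assumed helper for a readable message; check Varjo.h for the exact name):

// Call once per frame, e.g. after submitting the frame.
varjo_Error error = varjo_GetError(session);
if (error != varjo_NoError) {
	const char* description = varjo_GetErrorDesc(error);
	// ... show 'description' to the user or write it to your log ...
}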

 

THREADING 

The Varjo API is thread-safe. 

However, graphics APIs put additional constraints on this as, e.g., Direct3D calls will use the immediate context of the provided graphics device and they must not overlap with threads that use the same context. 

 

SWAP CHAINS 

Using a swap chain is the only way to submit frames to the compositor.

To create a swap chain, use varjo_D3D11CreateSwapChain (for DirectX 11) or varjo_GLCreateSwapChain (for OpenGL).

In order to use a swap chain, you need to do the following:

  • Call varjo_D3D11CreateSwapChain or varjo_GLCreateSwapChain
  • Enumerate the swap chain images with varjo_GetSwapChainImage to create render targets

For each frame:

  • Acquire the next swap chain image index:

    int32_t swapChainIndex = 0;
    varjo_AcquireSwapChainImage(session, &swapChainIndex);

  • Render into the created render target with the given index:

    render->render(frameInfo, swapChainIndex, donutObjects, controllerObjects);

  • Release the swap chain image with varjo_ReleaseSwapChainImage

To destroy swap chains, use varjo_FreeSwapChain.

 

LAYERS

The SDK provides one layer type (to be extended in the future): the multi-projection layer.

struct varjo_LayerMultiProj {
	struct varjo_LayerHeader header;
	varjo_Space space;
	int32_t viewCount;
	struct varjo_LayerMultiProjView* views;
};

varjo_LayerHeader contains flags which can change how layers are blended. varjo_LayerMultiProjView describes the texture of each view.

struct varjo_LayerMultiProjView {
	struct varjo_ViewExtension* extension;
	struct varjo_Matrix projection;
	struct varjo_Matrix view;
	struct varjo_SwapChainViewport viewport;
};

A view can also have an extension; one example is varjo_ViewExtensionDepth:

struct varjo_ViewExtensionDepth {
	struct varjo_ViewExtension header;
	double minDepth;
	double maxDepth;
	double nearZ;
	double farZ;
	struct varjo_SwapChainViewport viewport;
};

Attaching a depth buffer to a view enables better time warp and depth testing when mixed reality is turned on.
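
As a rough sketch, filling the layer for a frame could look like the following (the varjo_LayerMultiProjType and varjo_SpaceLocal constant names, and how the layer is passed to varjo_EndFrameWithLayers, are assumptions to be verified against Varjo_layers.h):

// One view entry per viewport; the matrices and viewports come from your frame setup.
varjo_LayerMultiProjView views[4];
for (int32_t i = 0; i < viewCount; ++i) {
	views[i].extension = nullptr;            // or point to a varjo_ViewExtensionDepth
	views[i].projection = projectionMatrices[i];
	views[i].view = viewMatrices[i];
	views[i].viewport = viewports[i];        // region of the swap chain texture for this view
}

varjo_LayerMultiProj layer{};
layer.header.type = varjo_LayerMultiProjType;  // assumed layer type constant
layer.space = varjo_SpaceLocal;                // assumed tracking space constant
layer.viewCount = viewCount;
layer.views = views;
// The layer is then submitted as part of the frame with varjo_EndFrameWithLayers.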

Full example code can be found in the Benchmark example.

 

TIMING 

Varjo uses nanoseconds as a time unit. Absolute times are relative to an epoch which is constant during execution of the program. Time can be queried using the varjo_GetCurrentTime function. 

To query the time for a frame, the varjo_FrameGetDisplayTime function returns the average perceived moment of when the image is shown. 

Because measuring time in nanoseconds yields very large numbers, you should be aware of possible precision issues when casting to other types. 
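
For example (assuming varjo_FrameGetDisplayTime takes the session and the frame info):

// Subtract the nanosecond timestamps first, then convert; converting the large
// absolute values to float or double directly can lose precision.
varjo_Nanoseconds now = varjo_GetCurrentTime(session);
varjo_Nanoseconds displayTime = varjo_FrameGetDisplayTime(session, frameInfo);
double secondsUntilDisplay = (double)(displayTime - now) * 1e-9;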

 

EVENTS 

Varjo API uses events to notify users about changes to the system and user input. 

  • Initialize a varjo_Event structure. It can be allocated on the stack, or, if you want it allocated on the heap, you can use the varjo_AllocateEvent helper function. 
  • Poll for the events in your main loop or any other place that gets called frequently by calling varjo_PollEvent in a loop. 

Varjo_events.h contains all available event types. 
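
A minimal polling sketch (varjo_EventType_Visibility is just one illustrative event type; see Varjo_events.h for the full list):

varjo_Event evt{};  // stack-allocated; varjo_AllocateEvent is the heap alternative
while (varjo_PollEvent(session, &evt)) {
	switch (evt.header.type) {
		case varjo_EventType_Visibility:
			// React to the application becoming visible or hidden in the headset.
			break;
		default:
			break;
	}
}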

 

PROPERTIES 

Properties can be used to query different Varjo system values. 

  • Call varjo_SyncProperties when you want to update the properties. 
  • Use varjo_HasProperty to check whether a specific property value exists. Each property has its own varjo_PropertyKey. 
  • Use varjo_GetProperty* to get the actual property value. 

All available properties are listed in Varjo_types.h. Currently, there are only gaze tracking-related properties. 
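
For example (varjo_PropertyKey_GazeCalibrated is used here as an assumed key name; see Varjo_types.h for the actual gaze-related keys):

varjo_SyncProperties(session);
if (varjo_HasProperty(session, varjo_PropertyKey_GazeCalibrated)) {
	varjo_Bool calibrated = varjo_GetPropertyBool(session, varjo_PropertyKey_GazeCalibrated);
	// e.g. prompt the user to run gaze calibration if 'calibrated' is false.
}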

TIMEWARP 

Timewarp on the Varjo headset is a way of making sure that the rotation of the headset in VR and in real life is always in sync. The image plane is rotated whenever the headset rotation changes, regardless of the rendered frame rate. This helps ensure the user does not feel nauseated while looking around wearing the Varjo headset. Varjo’s timewarp only works with rotational movement and does not help reduce stuttering caused by positional movement or animation. This feature works out of the box; you don’t need to do anything extra to enable it. 

 

20/20 EYE TRACKING 

You can use the full capability of the Varjo headset’s eye tracking feature in your software. Get more familiar with eye tracking on the Eye tracking page. API endpoints for eye tracking can be found in the API endpoint guide, available in the Varjo SDK for Custom Engines package under varjo-sdk/docs.