Unity Examples

Here you can find examples of simple scenes that show the main features of the Varjo headset. You can freely copy code from the examples into your own project. The example project is supplied together with the Varjo Plugin for Unity.

Note: Make sure to copy the example into your own project before modifying it. These examples are updated regularly, and any changes you make directly to them may be overwritten.

 

Rendering example

A basic example for testing that the headset is working and showing an image correctly. It can be used as a starting point for understanding how to develop for the Varjo headset. The scripts demonstrated in this example are common to all the following examples in this section.

Scripts used:

  • Tracked devices
    • Varjo_SteamVR_ControllerManager
    • Varjo_SteamVR_Manager
  • Controller
    • Varjo_SteamVR_TrackedObject

Varjo_SteamVR_ControllerManager

Script for enabling/disabling objects based on connectivity and assigned roles. It ensures that the Varjo headset and controllers are assigned the correct roles in VR and behave as expected.

Varjo_SteamVR_Manager

Script for handling rendering of all SteamVR_Cameras.

Varjo_SteamVR_TrackedObject

Script for controlling in-game objects with tracked devices.
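
The plugin's script handles SteamVR device indices and role assignment for you. As a rough illustration of the same idea, the sketch below drives a GameObject from a tracked pose using Unity's built-in XR node API; the class name and the use of XRNode are illustrative and not part of the example project.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal illustration of driving an object from a tracked device pose.
// Varjo_SteamVR_TrackedObject uses SteamVR device indices instead; this
// sketch only uses Unity's built-in XR node API for simplicity.
public class SimpleTrackedObject : MonoBehaviour
{
    public XRNode node = XRNode.RightHand;

    void Update()
    {
        // Apply the tracked pose to this object's local transform.
        transform.localPosition = InputTracking.GetLocalPosition(node);
        transform.localRotation = InputTracking.GetLocalRotation(node);
    }
}
```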

 

Events example

This example shows how to reset tracking position and how to record different events in the application.

Scripts used:

  • Events
    • VarjoTrackingReset
    • VarjoEvents

VarjoTrackingReset

Script for resetting the tracking origin to the user's current location when a keyboard button is pressed. This can be useful for seated experiences, where the user's position is stationary. Settings for the example can be found in the Inspector.
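
As a rough sketch of the underlying idea, the script below re-centers a rig root on the head camera when a key is pressed. The class and field names here are illustrative, not the plugin's; VarjoTrackingReset exposes its own options in the Inspector.

```csharp
using UnityEngine;

// Minimal sketch of a seated-experience reset: on a key press, move the rig
// root so the headset ends up over the rig's origin. Names are illustrative.
public class SimpleTrackingReset : MonoBehaviour
{
    public Transform rigRoot;      // parent of the tracked camera
    public Transform headCamera;   // tracked head camera
    public KeyCode resetKey = KeyCode.Space;

    void Update()
    {
        if (Input.GetKeyDown(resetKey))
        {
            // Shift the rig so the camera sits at the rig root's origin
            // (XZ only, keeping the user's real height).
            Vector3 offset = headCamera.position - rigRoot.position;
            offset.y = 0f;
            rigRoot.position -= offset;
        }
    }
}
```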

VarjoEvents

Script for logging events. It shows how to log when the Application and System buttons on the Varjo headset are pressed. Note that only the Application button is available to your application, as the System button is reserved. However, you can still detect when the System button is pressed.
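
A minimal sketch of the per-frame polling and logging pattern is shown below. The headset button query is written as a placeholder, so check the VarjoEvents script for the actual Varjo API call.

```csharp
using UnityEngine;

// Sketch of per-frame button event logging. GetApplicationButtonDown() is a
// hypothetical placeholder for the headset button query used by the plugin's
// VarjoEvents script; see that script for the real call.
public class SimpleButtonLogger : MonoBehaviour
{
    void Update()
    {
        if (GetApplicationButtonDown())
        {
            Debug.Log("Application button pressed on the headset");
        }
    }

    bool GetApplicationButtonDown()
    {
        // Placeholder: poll the Varjo API here. The Space key stands in for
        // the headset's Application button in this sketch.
        return Input.GetKeyDown(KeyCode.Space);
    }
}
```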

 

Interactive example

This example shows how to handle controller input and move the player around with basic teleportation and pickup scripts.

Scripts used:

  • Controller
    • VarjoTeleport
    • VarjoPickup

VarjoTeleport

A basic teleport script triggered by holding the controller touchpad. Teleportation moves the user around the scene without causing motion sickness.
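
A rough sketch of the teleport flow (aim while a button is held, move the rig on release) is shown below. It uses a plain KeyCode instead of the SteamVR touchpad input used by the example, and all names are illustrative.

```csharp
using UnityEngine;

// Minimal teleport sketch: while a button is held, raycast from the controller
// to find a target point; on release, move the rig there.
public class SimpleTeleport : MonoBehaviour
{
    public Transform rigRoot;            // player rig to move
    public Transform pointer;            // controller transform used for aiming
    public KeyCode teleportKey = KeyCode.T;
    public float maxDistance = 20f;

    private Vector3 target;
    private bool hasTarget;

    void Update()
    {
        if (Input.GetKey(teleportKey))
        {
            RaycastHit hit;
            hasTarget = Physics.Raycast(pointer.position, pointer.forward, out hit, maxDistance);
            if (hasTarget)
            {
                target = hit.point;
            }
        }
        else if (Input.GetKeyUp(teleportKey) && hasTarget)
        {
            // Keep the rig's height; only move along the ground plane.
            rigRoot.position = new Vector3(target.x, rigRoot.position.y, target.z);
            hasTarget = false;
        }
    }
}
```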

VarjoPickup

A pickup-and-throw script: objects are grabbed by holding the controller trigger and released with a throwing motion of the hand. It shows how to interact with objects, pick them up with the controllers, and throw them.
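
The sketch below illustrates the same pickup-and-throw idea with a plain KeyCode standing in for the controller trigger; it is not the example's implementation.

```csharp
using UnityEngine;

// Minimal pickup-and-throw sketch, attached to a hand/controller transform:
// while the grab key is held, the nearest rigidbody is parented to the hand;
// on release, the hand's recent velocity is applied so the object flies with
// the throwing motion.
public class SimplePickup : MonoBehaviour
{
    public KeyCode grabKey = KeyCode.G;
    public float grabRadius = 0.1f;

    private Rigidbody held;
    private Vector3 lastPosition;
    private Vector3 handVelocity;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        // Track the hand's velocity for the throw.
        handVelocity = (transform.position - lastPosition) / Time.deltaTime;
        lastPosition = transform.position;

        if (Input.GetKeyDown(grabKey) && held == null)
        {
            Collider[] nearby = Physics.OverlapSphere(transform.position, grabRadius);
            foreach (Collider c in nearby)
            {
                Rigidbody rb = c.attachedRigidbody;
                if (rb != null)
                {
                    held = rb;
                    held.isKinematic = true;
                    held.transform.SetParent(transform);
                    break;
                }
            }
        }
        else if (Input.GetKeyUp(grabKey) && held != null)
        {
            held.transform.SetParent(null);
            held.isKinematic = false;
            held.velocity = handVelocity;   // throw with the hand's motion
            held = null;
        }
    }
}
```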

 

Eye tracking example

If you are not familiar with the eye tracking capabilities of the Varjo headset, it is recommended that you familiarize yourself with Eye Tracking with Varjo headset before proceeding. For eye tracking to work, make sure you have enabled eye tracking in Varjo Base, as described in the Developing with 20/20 Eye Tracker section.

This example shows how to request eye tracking calibration and use the 20/20 Eye Tracker. Additionally, it shows how to poll button events from the headset, since calibration is requested by pressing the Application button on the headset. The Application button on the Varjo headset is meant for interacting with VR applications; you can freely choose what this button does in your application.

To access the Eye tracking code, check VarjoGazeRay.

This example also shows how eye tracking information can be logged to a file. The log includes frame data, the headset position and rotation, and per-eye pupil size, gaze direction, and focus distance. All the information is stored in a .csv file in the /Assets/logs/ folder.

To access the Gaze Logging code, check VarjoGazeLog.

An example for accessing the Varjo headset’s Application button is demonstrated in the VarjoGazeCalibrationRequest file.

Scripts used:

  • Gaze
    • VarjoGazeRay
    • VarjoGazeCalibrationRequest
    • VarjoGazeTarget
  • GazeLogger
    • VarjoGazeLog

 

VarjoGazeRay

Shoots rays toward where the user is looking and sends hit events to VarjoGazeTargets. You can use it to find the point in the scene the user is looking at.

Requires eye tracking calibration first.
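
As a rough sketch of the approach, the script below transforms a gaze origin and direction from headset space to world space and raycasts into the scene. GetGazeRay() is a hypothetical placeholder for the actual Varjo gaze query used by VarjoGazeRay.

```csharp
using UnityEngine;

// Sketch of gaze raycasting: take a gaze origin and direction in the headset's
// local space, transform them to world space, and raycast into the scene.
public class SimpleGazeRay : MonoBehaviour
{
    public Transform headCamera;   // the tracked head camera
    public float maxDistance = 50f;

    void Update()
    {
        Vector3 origin, direction;
        if (!GetGazeRay(out origin, out direction))
            return;

        // Gaze data is relative to the headset, so transform it to world space.
        Vector3 worldOrigin = headCamera.TransformPoint(origin);
        Vector3 worldDirection = headCamera.TransformDirection(direction);

        RaycastHit hit;
        if (Physics.Raycast(worldOrigin, worldDirection, out hit, maxDistance))
        {
            // Notify the hit object, e.g. a VarjoGazeTarget-style component.
            hit.collider.SendMessage("OnGazeHit", SendMessageOptions.DontRequireReceiver);
        }
    }

    bool GetGazeRay(out Vector3 origin, out Vector3 direction)
    {
        // Placeholder: replace with the plugin's gaze query. Here we just look
        // straight ahead from the headset so the sketch runs without eye tracking.
        origin = Vector3.zero;
        direction = Vector3.forward;
        return true;
    }
}
```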

VarjoGazeCalibrationRequest

When the user presses the Application button or the Space key, an eye tracking calibration request is sent. To ensure the best eye tracking performance, calibration needs to be repeated every time the headset is taken off and put back on.
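
A minimal sketch of the request flow is shown below; RequestCalibration() is a hypothetical placeholder for the actual Varjo calibration call made by VarjoGazeCalibrationRequest.

```csharp
using UnityEngine;

// Sketch of requesting gaze calibration from a key press. The real script also
// listens to the headset's Application button.
public class SimpleCalibrationRequest : MonoBehaviour
{
    public KeyCode calibrationKey = KeyCode.Space;

    void Update()
    {
        if (Input.GetKeyDown(calibrationKey))
        {
            RequestCalibration();
        }
    }

    void RequestCalibration()
    {
        // Placeholder for the Varjo calibration request call.
        Debug.Log("Gaze calibration requested");
    }
}
```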

VarjoGazeTarget

Changes the color of the target sphere when the user looks at it. You can use it as a starting point for changing object behavior based on gaze interactions.
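
The sketch below shows one way such a target could react to gaze hits. It listens for the OnGazeHit message sent by the gaze ray sketch above and is not the example's implementation.

```csharp
using UnityEngine;

// Sketch of a gaze target that changes color while looked at and reverts
// shortly after the gaze leaves it.
[RequireComponent(typeof(Renderer))]
public class SimpleGazeTarget : MonoBehaviour
{
    public Color gazedColor = Color.green;

    private Color originalColor;
    private Renderer targetRenderer;
    private float lastHitTime;

    void Start()
    {
        targetRenderer = GetComponent<Renderer>();
        originalColor = targetRenderer.material.color;
    }

    void OnGazeHit()
    {
        lastHitTime = Time.time;
        targetRenderer.material.color = gazedColor;
    }

    void Update()
    {
        // Revert to the original color shortly after the last gaze hit.
        if (Time.time - lastHitTime > 0.2f)
        {
            targetRenderer.material.color = originalColor;
        }
    }
}
```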

VarjoGazeLog

Logs gaze data when a button is pressed. Requires eye tracking calibration first. The logged data is stored in the project folder.

Settings for logging can be seen in the Unity editor, and you can modify them according to your needs.
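
As a rough sketch of the logging pattern, the script below writes one CSV row per frame into an Assets/logs folder while logging is toggled on. The real VarjoGazeLog logs more fields per eye and reads actual gaze data; the head camera's forward vector stands in here.

```csharp
using System.IO;
using UnityEngine;

// Sketch of CSV gaze logging: a key toggles logging on and off, and one row
// per frame is written with a timestamp, head pose, and a stand-in gaze vector.
public class SimpleGazeLogger : MonoBehaviour
{
    public Transform headCamera;
    public KeyCode toggleKey = KeyCode.Return;

    private StreamWriter writer;

    void Update()
    {
        if (Input.GetKeyDown(toggleKey))
        {
            if (writer == null)
            {
                Directory.CreateDirectory(Application.dataPath + "/logs");
                string path = Application.dataPath + "/logs/gaze_" +
                              System.DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".csv";
                writer = new StreamWriter(path);
                writer.WriteLine("time;head_pos;head_rot;gaze_forward");
            }
            else
            {
                writer.Close();
                writer = null;
            }
        }

        if (writer != null)
        {
            // Placeholder gaze: the headset's forward vector stands in for real gaze data.
            writer.WriteLine(string.Format("{0};{1};{2};{3}",
                Time.time, headCamera.position, headCamera.rotation.eulerAngles, headCamera.forward));
        }
    }

    void OnDestroy()
    {
        if (writer != null) writer.Close();
    }
}
```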

 

UI example

 

This example shows how to create a Unity Canvas UI for the headset. The UI is pointed at by moving your head or using the hand controllers. Selections can be made with the keyboard, the headset's Application button, or the hand controller trigger. Buttons, toggles, and dropdowns are supported.

Scripts used:

  • VarjoEventSystem
    • VarjoInputModule
    • VarjoPointer

VarjoInputModule

A replacement for Unity's Standalone Input Module, intended for world-space canvases in VR. It handles keyboard, headset, and hand controller input for the canvas and is recommended for use with VarjoPointer (see below). A dummy camera is required for pointing; rendering of this camera can be disabled. If VarjoPointer is used, add the dummy camera to it.

VarjoPointer

A visual representation for UI pointing. It works with both canvases and scene geometry, and automatically switches to the last used or connected input method.
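
The sketch below shows a minimal visual pointer built from a LineRenderer that shortens to the first collider it hits. Unlike VarjoPointer, it does not handle canvases or input switching; it simply follows the transform it is placed on.

```csharp
using UnityEngine;

// Sketch of a simple visual pointer: a line drawn along the pointing
// transform's forward axis, clipped to the first surface it hits.
[RequireComponent(typeof(LineRenderer))]
public class SimplePointer : MonoBehaviour
{
    public float maxLength = 5f;

    private LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
        line.startWidth = 0.005f;
        line.endWidth = 0.005f;
    }

    void Update()
    {
        float length = maxLength;
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, maxLength))
        {
            length = hit.distance;
        }

        line.SetPosition(0, transform.position);
        line.SetPosition(1, transform.position + transform.forward * length);
    }
}
```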

 

MR Example

This example shows how to display the video pass-through image from both cameras of the XR-1 Developer Edition in the headset, and demonstrates the mixed reality related functionality.

Scripts used:

  • VarjoMixedReality

VarjoMixedReality

Controls for mixed reality related functionalities.
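
A minimal sketch of toggling pass-through from a key press is shown below. StartPassThrough() and StopPassThrough() are hypothetical placeholders; the VarjoMixedReality script in the example calls the actual Varjo mixed reality API.

```csharp
using UnityEngine;

// Sketch of toggling video pass-through with a key. The Varjo calls are
// represented by placeholders.
public class SimpleMixedRealityToggle : MonoBehaviour
{
    public KeyCode toggleKey = KeyCode.M;

    private bool passThroughEnabled;

    void Update()
    {
        if (Input.GetKeyDown(toggleKey))
        {
            passThroughEnabled = !passThroughEnabled;
            if (passThroughEnabled)
                StartPassThrough();
            else
                StopPassThrough();
        }
    }

    void StartPassThrough()
    {
        // Placeholder for the Varjo call that starts camera pass-through rendering.
        Debug.Log("Video pass-through enabled");
    }

    void StopPassThrough()
    {
        // Placeholder for the Varjo call that stops camera pass-through rendering.
        Debug.Log("Video pass-through disabled");
    }
}
```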

 

MR Stream Example

This example shows how to process the raw video pass-through image from the cameras of the XR-1 Developer Edition and trigger an event every frame.

Scripts used:

  • VarjoMixedRealityStreams

VarjoMixedRealityStreams

Controls for mixed reality stream related functionalities.
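
The sketch below illustrates the per-frame event pattern described above: a C# event is raised whenever a new frame is available. TryGetLatestFrame() is a hypothetical placeholder; the VarjoMixedRealityStreams script reads real frames from the Varjo stream API.

```csharp
using System;
using UnityEngine;

// Sketch of raising a per-frame event with the latest camera frame so other
// components can subscribe and process it.
public class SimpleStreamEvents : MonoBehaviour
{
    public event Action<Texture2D> FrameReceived;

    void Update()
    {
        Texture2D frame;
        if (TryGetLatestFrame(out frame) && FrameReceived != null)
        {
            FrameReceived(frame);
        }
    }

    bool TryGetLatestFrame(out Texture2D frame)
    {
        // Placeholder: replace with the actual stream frame query.
        frame = null;
        return false;
    }
}
```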