Combining VR and Neurotechnology with OpenBCI’s Galea

November 15, 2022
by Joseph Artuso, Chief Commercial Officer, OpenBCI, and Kritiksha Sharma, Communications Specialist, OpenBCI

Medical Research

Who are we? 

Neurotechnology has taken the world by storm in the past decade. This combination of neuroscience and technology has demonstrated numerous real-world applications through its power to decode and augment brain activity. Neural interface technology has the potential to revolutionize how we interact with each other and with our increasingly digital world. For nearly a decade, OpenBCI has been at the forefront of expanding consumer access to neurotechnology. Our open-source hardware and software have been a starting point for a new generation of neuroscientists and tech entrepreneurs. Products like the Cyton board, Ganglion, and Ultracortex have been serving neurotech enthusiasts’ research and interests since 2014.

Our latest product, Galea, fully integrated with the Varjo Aero headset, was built to be the bridge between virtual reality and neurotechnology.

What led us to build Galea? 

With OpenBCI, what started as a movement among our initial customer base of makers and early-adopters has now grown into a global community of scientists, developers, entrepreneurs, and innovation teams within major tech companies.

We’ve had a unique vantage point on our industry’s evolution and we’ve paid close attention to the following trends:

  • AR/VR has become a powerful tool for neuroscience and other experimental research
  • Major technology companies are increasingly exploring physiological sensors for user testing and as user inputs for AR/VR

Neuroscience, spatial computing, and consumer technology are converging, and we’ve watched our existing customers increasingly combine our previous products with head-mounted displays (HMDs). Alongside these gradual industry-wide developments, our users had been requesting HMD integration and a single device combining multiple sensors for some time.

Our Latest Offering 

So, we created Galea to be a powerful development kit for teams looking to explore the intersection of spatial computing and neurotechnology. 

Galea neurotechnology headset

Galea Tech Specs: What kind of data does it collect? And how? 

Galea is engineered to enable high-quality data collection from multiple parts of a user’s nervous system. The device combines sensor networks for EEG, EOG, EMG, EDA, and PPG (see the overview below), designed by OpenBCI, with the high-quality eye tracking of the Varjo Aero. All sensors run under a single hardware clock, so Galea dramatically simplifies the process of collecting synchronized data from the body. This makes it an incredible tool for anybody looking to objectively measure user experiences and cognitive states.
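
To give a feel for what working with a synchronized, multi-channel stream looks like in practice, here is a rough sketch using the open-source BrainFlow library commonly used with OpenBCI boards. The Galea-specific board ID and channel layout are assumptions we don’t spell out here, so BrainFlow’s built-in synthetic board stands in:

```python
# Minimal sketch: pulling a block of time-stamped, synchronized samples via BrainFlow.
# The synthetic board is a stand-in; a real Galea setup would use its own board ID
# and connection parameters (an assumption -- check OpenBCI's documentation).
import time

from brainflow.board_shim import BoardIds, BoardShim, BrainFlowInputParams

board_id = BoardIds.SYNTHETIC_BOARD          # stand-in for the actual Galea board ID
board = BoardShim(board_id, BrainFlowInputParams())

board.prepare_session()
board.start_stream()
time.sleep(5)                                # record roughly 5 seconds
data = board.get_board_data()                # 2D array: one row per channel, one column per sample
board.stop_stream()
board.release_session()

# One shared timestamp row keeps the modalities aligned with each other,
# which is what a single hardware clock buys you on the real device.
fs = BoardShim.get_sampling_rate(board_id)
ts_row = BoardShim.get_timestamp_channel(board_id)
eeg_rows = BoardShim.get_eeg_channels(board_id)

print(f"{data.shape[1]} samples at {fs} Hz")
print("first timestamp:", data[ts_row, 0])
print("EEG block shape:", data[eeg_rows, :].shape)
```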

An overview of the different hardware sensors we’ve packed into Galea:

Galea Sensor Hardware

  • VR Headset: fully integrated with Varjo Aero
  • Scalp EEG: 8x dry, soft-polymer, active electrodes
  • Galea Facepad: EMG – 4 channels; EEG – 2 channels, passive; EOG – 2 channels; EDA – 1 forehead sensor; PPG – 1 forehead sensor
  • Image-based eye tracking: via Varjo Aero – 200 Hz with sub-degree accuracy; 1-dot calibration for foveated rendering

The combination of sensors and their arrangement across the scalp and facepad enables researchers to gather data that can classify a diverse range of internal states, including the user’s attention, cognitive workload, facial expressions, emotional arousal, stress levels, and physical performance.

Galea sensor cheat sheet

Galea’s multi-modal sensor system, integrated software, and Varjo’s VR hardware together equip users with a robust tool for accelerating innovation across industries.

What can you build with Galea?

The OpenBCI team really enjoys finding new ways to turn science fiction into science with Galea. Our team in the New York office frequently bands together to brainstorm and build new applications that reveal more of what Galea can do. Here are just a few of the things we’ve enjoyed building with it.

New hands-free control schemes & inputs

Imagine a hands-free Temple Run, one where you can steer your character with just a smirk. That is exactly what we built with Galea this past month. Our CEO, Conor, put on the Galea headset to play the game Cat Runner and steered his runner simply by twitching his left and right cheek muscles. Watch the whole demo video below.

This demo is a simple example of how EMG signals from parts of the face can serve as an additional set of user inputs for VR. These muscle and eye movements augment the functionality of your hands, effectively giving the user extra buttons.
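
As a rough illustration of the “extra buttons” idea, the sketch below thresholds the short-time RMS amplitude of two hypothetical cheek EMG channels to emit discrete left/right steering events. The channel assignment, sampling rate, and threshold are illustrative assumptions, not values from the Galea SDK:

```python
# Minimal sketch: turning cheek EMG activity into discrete left/right game inputs.
# Sampling rate, window size, and threshold are illustrative assumptions that
# would be tuned per user on real Galea data.
import numpy as np

FS = 250                  # assumed EMG sampling rate (Hz)
WINDOW = FS // 10         # 100 ms analysis window
THRESHOLD_UV = 40.0       # contraction threshold in microvolts

def rms(window: np.ndarray) -> float:
    """Root-mean-square amplitude of one EMG window."""
    return float(np.sqrt(np.mean(np.square(window))))

def detect_smirk(left_emg: np.ndarray, right_emg: np.ndarray) -> str | None:
    """Return 'left', 'right', or None based on the most recent 100 ms of EMG."""
    left_level = rms(left_emg[-WINDOW:])
    right_level = rms(right_emg[-WINDOW:])
    if left_level > THRESHOLD_UV and left_level >= right_level:
        return "left"
    if right_level > THRESHOLD_UV:
        return "right"
    return None

# Example: a burst of activity on the right cheek channel maps to a right turn.
quiet = np.random.normal(0, 5, WINDOW)     # resting cheek
burst = np.random.normal(0, 80, WINDOW)    # contracted cheek
print(detect_smirk(quiet, burst))          # -> "right"
```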

Another hands-free gaming application combining EMG and eye tracking that the OpenBCI team is currently working on is our VR take on an old game called Pipe Dream. Our Director of Software, Philip, designed a pipe-building game that is navigable purely with the player’s facial movements. An eyebrow raise makes the pipe go up, while an eyebrow furrow makes it go down. A quick staredown with the pipe can place it on the grid, and a flutter of the eyelids can resume your old game – all of this is made possible by Galea’s EMG sensors and integrated eye tracking.

Facial muscle movements, combined with eye tracking, are used as new forms of user input.

Galea’s EMG sensors can detect the slightest muscle tension in the face and translate that movement into a gaming action. The facepad’s electrodes pick up spikes in the electrical activity of the contracting muscles, and facial EMG classifiers map those signals to the corresponding navigation actions.
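
To make the combination of facial EMG and eye tracking concrete, here is a hedged sketch of how classified gestures and a gaze dwell might be fused into the Pipe Dream-style actions described above. The gesture labels, dwell time, and data structures are illustrative assumptions, not actual Galea or Varjo SDK interfaces:

```python
# Minimal sketch: fusing an EMG gesture classifier with eye-tracking dwell
# into discrete game actions. Gesture labels, dwell duration, and the gaze
# data structure are illustrative assumptions.
from dataclasses import dataclass

DWELL_SECONDS = 0.8           # how long gaze must rest on a cell (the "staredown")

@dataclass
class GazeSample:
    t: float                  # timestamp in seconds
    cell: tuple[int, int]     # grid cell the player is currently looking at

def next_action(gaze: list[GazeSample], gesture: str | None) -> str:
    """Map the latest EMG gesture and gaze history onto a game action."""
    if gesture == "brow_raise":
        return "move_pipe_up"
    if gesture == "brow_furrow":
        return "move_pipe_down"
    if gaze:
        target = gaze[-1].cell
        # Find when the current, uninterrupted dwell on `target` began.
        dwell_start = gaze[-1].t
        for sample in reversed(gaze):
            if sample.cell != target:
                break
            dwell_start = sample.t
        if gaze[-1].t - dwell_start >= DWELL_SECONDS:
            return "place_pipe"
    return "idle"

# Example: one second of steady gaze on cell (2, 3) places the pipe;
# an eyebrow raise takes priority and moves it up instead.
samples = [GazeSample(t=i * 0.1, cell=(2, 3)) for i in range(11)]
print(next_action(samples, None))          # -> "place_pipe"
print(next_action(samples, "brow_raise"))  # -> "move_pipe_up"
```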

Neurofeedback VR applications

Fancy a room custom-built to match your brain’s mood? The Synesthesia room is a VR playground where the environment’s audio and visuals are dynamically generated in real time based on the user’s EEG brain activity. The EEG signal is split into the five main frequency bands, and the room’s audio and visuals change depending on the relative strength of each band.

This is what it looks like when signals from a user’s brain control and modify the colors of the VR space.

Designed as a neurofeedback VR application, the Synesthesia room shows how Galea’s data and Unity SDK can be used to create VR experiences that give users feedback on internal states like cognitive workload, stress, and attention that they may not otherwise be able to perceive in a quantifiable way. Research has shown that, given this feedback, people can more quickly learn how to control or modulate these internal states.
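
Here is a rough sketch of the core neurofeedback loop under simple assumptions: a one-channel block of EEG samples is turned into relative power for the five conventional bands, and one of those values drives a visual parameter. The band edges are the standard ones; the mapping to a hue is an illustrative choice, not the Synesthesia room’s actual implementation:

```python
# Minimal sketch: relative EEG band power driving a visual parameter.
# Band edges are the conventional ones; the hue mapping is an illustrative assumption.
import numpy as np
from scipy.signal import welch

FS = 250                                      # assumed EEG sampling rate (Hz)
BANDS = {
    "delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}

def relative_band_power(eeg: np.ndarray) -> dict:
    """Relative power per band for one EEG channel (1-D array of samples)."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)
    in_range = (freqs >= 1) & (freqs <= 45)
    total = float(np.sum(psd[in_range]))
    return {
        name: float(np.sum(psd[(freqs >= lo) & (freqs < hi)]) / total)
        for name, (lo, hi) in BANDS.items()
    }

def room_hue(powers: dict) -> float:
    """Map relative alpha power onto a hue in [0, 1]; calmer -> bluer (illustrative)."""
    return min(1.0, 0.35 + 0.3 * powers["alpha"])

# Example with four seconds of synthetic "EEG" noise.
eeg = np.random.normal(0, 10, FS * 4)
powers = relative_band_power(eeg)
print({band: round(p, 2) for band, p in powers.items()})
print("hue:", round(room_hue(powers), 2))
```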

How can you learn more about Galea?

We’re looking forward to sending Galea out into the world and seeing all the cool new applications that other users are able to build with it. We see Galea as a dynamic, next-generation research platform for businesses and developers who are looking to explore the power of combining biosensing with VR.

We’ve already received a significant number of pre-orders from companies and researchers in consumer technology, healthcare, gaming, and interactive media. As global interest in Galea grows, we’re nearing the end of our pre-order process, having closed Batch #3 on October 31st. For more details and pre-order information, visit galea.co.

To stay updated on all the latest happenings, events, and news related to Galea or OpenBCI, sign up for our monthly newsletter here.