With OpenBCI’s Galea, Bayerlein was able to deftly fly a drone around the TED presentation stage. Galea captures even minute muscle movements and turns them into control inputs, functioning exactly as if the person were moving a joystick by hand.
“Essentially, we scoured Christian’s body for residual motor function and then turned these muscles into digital buttons. Next, we turned the buttons into sliders and combined it with a joystick,” Russomanno explains.
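The mapping Russomanno describes can be pictured as a small signal-processing chain. The sketch below is purely illustrative (not OpenBCI's actual pipeline): the function names, thresholds, and calibration values are hypothetical, but they show how a rectified, smoothed EMG envelope could become first a digital button and then a proportional slider.

```python
# Illustrative sketch (not OpenBCI's actual pipeline): mapping a smoothed
# EMG amplitude envelope to a digital button and a proportional slider.
# All thresholds below are hypothetical calibration values.

def smooth(samples, window=5):
    """Moving-average envelope of rectified EMG samples."""
    rectified = [abs(s) for s in samples]
    return sum(rectified[-window:]) / min(window, len(rectified))

def to_button(envelope, threshold=0.2):
    """Muscle 'button': pressed when activity exceeds a calibrated threshold."""
    return envelope > threshold

def to_slider(envelope, rest=0.05, max_effort=0.8):
    """Muscle 'slider': activity rescaled to a 0.0-1.0 control value."""
    value = (envelope - rest) / (max_effort - rest)
    return max(0.0, min(1.0, value))

# Two such sliders (from two muscle groups) could then be paired into a
# virtual joystick axis, as in the drone demo.
samples = [0.0, 0.1, -0.5, 0.6, -0.4, 0.5]
env = smooth(samples)
print(to_button(env), round(to_slider(env), 2))
```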
The same brain-computer interface approach can be applied to any device or technology operated with a digital joystick or similar controls, opening up a world of possibilities for augmented and expanded control. And this is just one example of the fields where Galea could be applied.
Galea’s software suite can turn the sensor data it captures into meaningful metrics that enable new kinds of user interaction and analysis. For example, Varjo’s high-speed eye tracking integrated into the Aero headset allows Galea to accurately measure the user’s gaze at all times.
While the TED Talk demonstrated hands-free control schemes, Galea also enables the analysis of different mental states such as stress, fatigue, cognitive workload, and focus, based on real-time data from the user’s body.
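To make the mental-state idea concrete, here is a minimal sketch of one common heuristic from the neurotech literature: estimating engagement from the ratio of beta- to alpha-band EEG power. This is an assumed, illustrative approach, not Galea's actual algorithm, and the function names and band choices are hypothetical.

```python
# Illustrative sketch (assumed approach, not Galea's actual algorithm):
# a simple "focus" proxy from EEG band power via the beta/alpha ratio,
# a common heuristic in the neurotech literature.
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within [low, high] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def focus_index(eeg, fs=250):
    """Beta (13-30 Hz) over alpha (8-12 Hz) power: higher ~ more engaged."""
    return band_power(eeg, fs, 13, 30) / (band_power(eeg, fs, 8, 12) + 1e-12)

# Synthetic check: a beta-dominated signal scores higher than an
# alpha-dominated one (2 s of data at a 250 Hz sampling rate).
t = np.arange(0, 2, 1 / 250)
alpha_heavy = np.sin(2 * np.pi * 10 * t)
beta_heavy = np.sin(2 * np.pi * 20 * t)
print(focus_index(beta_heavy) > focus_index(alpha_heavy))
```

A production system would use artifact rejection, per-user calibration, and validated models rather than a raw band ratio, but the principle of turning raw physiological data into an interpretable metric is the same.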
Galea is an exciting example of what wearable neurotechnology combined with cutting-edge immersive technology can accomplish, and of how it can make the world a more accessible place for everyone. The idea is best summarized by Christian Bayerlein in the talk:
“We saw what’s possible, when cutting-edge technology is combined with human curiosity and creativity. So let’s build tools that empower people, applications that break down barriers, systems that unlock a world of possibilities. I think that’s an idea worth spreading.”
The NeuroFly software toolkit has been released free and open-source so that others can use it as a starting point for their own projects. Visit: NeuroFly Toolkit | OpenBCI Documentation