Experiencing Dementia Through Mixed Reality in Cycle 7
What does it mean to forget yourself? To inhabit a space where memory and reality intertwine, where the line between present and past is blurred by age and illness?
Cycle 7 is a mixed reality theatrical experience that offers a rare and deeply human glimpse into the world of someone living with dementia. Using the Varjo XR-3 headset and a seamless fusion of physical performance, immersive environments, and emotional storytelling, Cycle 7 places the audience into the lived experience of an elderly resident nearing the end of their seventh life cycle in a futuristic care center.
We spoke with the project’s creator, Matthew Soson, about the vision, technical challenges, and emotional breakthroughs behind this singular performance.

Please introduce yourself and the Cycle 7 project.
I’m Matthew Soson, an interdisciplinary artist exploring the intersection of performance, technology, and embodiment. My work investigates the fluidity of reality and the human form, with the aim of fostering greater empathy through immersive storytelling. Cycle 7 emerged from an earlier research initiative, ASURPPS, which focused on the symbiotic relationship between liveness, intimacy, and community in virtually-mediated performance.
Cycle 7 places a single audience member within the experience of an elderly resident living with dementia near the end of their seventh life cycle in a futuristic senior living center. Wearing a headset with passthrough cameras and earbuds to assist their failing sight and hearing, the audience navigates a narrative landscape where memories and reality intermingle. As objects and exercises trigger recollections of past life cycles, the boundaries between present reality, memory, and fantasy dissolve.
The technical concept was born from a desire to create an experience for audiences with no discernibility between the physical and digital. When the base reality is established as real through physical interaction and improvised dialogue with performers, audiences gain a concrete framework for understanding what “real” looks, feels, and sounds like. When VR memories and AR images match that “reality framework,” their ability to discern between the two breaks down, creating unprecedented depths of immersion into the narrative.
What were your primary technical challenges in integrating mixed reality into live theater, and how did you overcome them?
The most significant challenge was achieving visual cohesion between passthrough camera feeds and digital environments. The XR-3’s adjustable chroma key function greatly facilitated integrating physical performers into digital sets, without which the work would not have been possible.
Another major challenge was developing a control system that could direct both digital elements (environments, audio, animations) and physical components (lighting and sound) simultaneously. We created a custom OSC-based cueing system that allowed synchronized control across platforms.
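To make the cueing idea concrete, here is a minimal sketch (not the production code, which was built in TouchDesigner) of how a single cue can be encoded as an OSC message and broadcast over UDP to several listeners at once, such as a game engine, a lighting controller, and an audio rig. The `/cue/go` address and the endpoint list are hypothetical names chosen for illustration.

```python
import socket
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message (int, float, or string arguments).

    OSC strings are null-terminated and padded to 4-byte boundaries;
    numeric arguments are packed big-endian.
    """
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)

    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            tags += "s"
            payload += pad(str(a).encode())
    return pad(address.encode()) + pad(tags.encode()) + payload

def fire_cue(cue_number, endpoints):
    """Send the same cue to every listening system (hypothetical addresses)."""
    msg = osc_message("/cue/go", cue_number)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host, port in endpoints:
        sock.sendto(msg, (host, port))
    sock.close()

# Example: one "go" reaching Unity, lighting, and audio simultaneously
fire_cue(7, [("127.0.0.1", 9000),   # Unity scene controller
             ("127.0.0.1", 9001),   # lighting bridge
             ("127.0.0.1", 9002)])  # audio playback
```

Because OSC is a simple, transport-agnostic format, one cue number can drive heterogeneous systems without each needing to know about the others.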
Perhaps most challenging was the onboarding process—how to transition audiences from everyday reality into our mixed reality narrative. We developed a multi-stage approach: a pre-lobby where participants donned robes and slippers, a lobby with in-world artifacts, a transitional hallway, and finally, a ritual application of the headset incorporated into the narrative itself. This gradual transition proved essential for preparing audiences psychologically and somatically for the experience to come.

How did using Varjo’s HMDs specifically contribute to the project?
The Varjo XR-3 headset was foundational to our vision of seamless reality-blending. Its high-resolution stereoscopic passthrough cameras matched how humans naturally see, allowing audiences to maintain their sense of embodiment while transitioning between realities.
A huge advantage of the Varjo over other systems was the ability to manipulate the shaders used to render the passthrough image. By creating a shader that added posterization and edge-finding on top of the chroma keying, we could match the post-processing aesthetic of our digital assets and create complete cohesion.
Most surprisingly, the headset’s physical presence became dramaturgically meaningful. By acknowledging the device within the narrative as assistive technology for an elderly character, we transformed what could have been a distraction into an integral part of the experience.
Can you describe the custom shaders and novel workflows you developed?
One key technical innovation was a custom HLSL shader for the Varjo XR-3 that included edge detection for better integration with Unity environments. This reduced the overall colorspace and emphasized outlines, helping us unify physical and digital elements.
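The production shader ran as HLSL on the GPU, but the two operations it combined are easy to illustrate off the GPU. Below is a minimal NumPy sketch, assuming grayscale float images in [0, 1]: posterization quantizes the colorspace into a few tones, and a simple forward-difference gradient marks outlines. The function names and the basic edge detector are illustrative assumptions, not the project's actual kernel.

```python
import numpy as np

def posterize(img, levels=4):
    """Quantize image values in [0, 1] down to a reduced set of tones."""
    return np.floor(img * levels) / levels

def edge_mask(gray, threshold=0.1):
    """Mark pixels where the local gradient exceeds a threshold.

    Uses forward differences along each axis, padding the final
    row/column so the mask keeps the input's shape.
    """
    gx = np.abs(np.diff(gray, axis=1, append=gray[:, -1:]))
    gy = np.abs(np.diff(gray, axis=0, append=gray[-1:, :]))
    return (gx + gy) > threshold

def stylize(gray, levels=4, threshold=0.1):
    """Posterize, then darken detected edges to emphasize outlines."""
    out = posterize(gray, levels)
    out[edge_mask(gray, threshold)] = 0.0
    return out
```

Applying the same pair of operations to both the passthrough feed and the rendered assets is what pushes the two image sources toward a single, unified look.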
We also developed a custom OSC-based control system in TouchDesigner that communicated with Unity, lighting controllers, and audio systems simultaneously. Our workflow included a pipeline for capturing volumetric performances with the Mantis Vision system and integrating them into the scene in real time.
Our lighting design involved DMX-controlled lights synchronized with virtual lighting through our OSC system—crucial for maintaining visual continuity.
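At the data level, keeping physical and virtual lighting in step comes down to mapping one shared intensity value into both systems. The hypothetical helper below sketches the physical half of that mapping: normalized intensities (the same 0.0 to 1.0 values a virtual light might use) scaled into a 512-channel DMX universe. The helper name and the direct linear scaling are assumptions for illustration, not the production rig.

```python
def dmx_frame(channel_levels, universe_size=512):
    """Build one DMX512 universe from normalized light intensities.

    channel_levels maps DMX channel numbers (1-512) to levels in
    [0.0, 1.0]; each level is clamped and scaled to a 0-255 byte.
    """
    frame = bytearray(universe_size)
    for channel, level in channel_levels.items():
        frame[channel - 1] = round(max(0.0, min(1.0, level)) * 255)
    return bytes(frame)

# Example: a cue sets a key light to full and a fill light to half,
# while the same 1.0 / 0.5 values drive the matching virtual lights.
frame = dmx_frame({1: 1.0, 10: 0.5})
```

Driving both halves from one set of values is what keeps a practical lamp and its digital counterpart fading together instead of drifting apart.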


How did you approach combining live performers, volumetric captures, and virtual environments?
We established visual consistency through lighting calibration and a custom 360-degree chroma room integrated into the set. Performers were trained in Moment Work and Viewpoints techniques to interact fluidly with both audience members and virtual elements.
Touch became a grounding element—when audience members could physically feel a performer touching them while simultaneously seeing that interaction through passthrough, it reinforced the reality of what they were experiencing.
Improvisation helped us adapt dynamically to audience behavior, maintaining narrative cohesion even when technical elements behaved unexpectedly.
What key learnings would you share with others working on similar immersive experiences?
Start small, perfect your fundamental techniques, and expand methodically rather than attempting to realize an ambitious vision all at once.
Prioritize human connection and embodiment before spectacle. Our research showed that physical touch had the highest impact—more than any technical effect. Similarly, the gradual onboarding process through wardrobe changes and ritualistic interactions fundamentally shaped audience reception.
Acknowledge the technology within your narrative. Incorporating the headset as an assistive device turned a potential barrier into a narrative asset. Visual cohesion also proved more important than visual fidelity.


What do you think this project reveals about the future of mixed reality in theater and storytelling?
Cycle 7 points toward a future where the boundaries between audience and performer become increasingly fluid. Rather than passive consumption, mixed reality theater invites active co-creation through embodied participation.
I believe we’re moving toward a practice that centers human experience rather than technical spectacle. The future lies in what I call “reality expansion” rather than “technorealism”—enabling audiences to experience the fluidity of reality, perception, and identity rather than perfectly simulating physical reality.
Mixed reality theater may also serve as a safe space to explore emerging technologies and philosophical ideas—helping us process complex themes like consciousness and perception.
How did audiences respond to Cycle 7?
Audiences reported feeling simultaneously immersed and overwhelmed. Initial disorientation typically gave way to play and integration. The use of touch scored 4.89 out of 5 in our research study for increasing presence and lowering discernibility—higher than any other practice.
Even costuming audience members with robes and slippers proved remarkably effective in shifting identity and preparing them for the experience.
Perhaps most importantly, the felt presence created by the experience outweighed the need for perfect realism. Mixed reality doesn’t need to trick the eye—it needs to touch the heart.
Learn More
To explore visuals, behind-the-scenes insights, and documentation from Cycle 7, visit the official project page at mattsoson.com/cycle-7. Whether you’re an artist, technologist, or immersive experience designer, it’s a powerful glimpse into how mixed reality can deepen empathy and storytelling in unexpected ways.