AWE 2021 thoughts and takeaways
Rémi Arnaud, Principal Architect, has been working at Varjo since early 2019 and is a keen attendee of AWE (Augmented World Expo). Below are his personal thoughts on the 2021 edition of the event.
Back in person
AWE 2021 took place at a real location – Santa Clara, CA – at a specific point in time – Nov 9-11 – with people in the flesh. This used to sound quite ordinary, but after almost two years without a trade show, it was extraordinary. What a privilege to have been there among enthusiasts who were brave enough to follow the rules – everybody had to show proof of vaccination and wear a mask – to participate in this event celebrating XR.
There is an interesting paradox about XR: it can only truly be experienced in person. No amount of YouTube, Zoom, or VRChat can convey the experience of a high-resolution Varjo HMD, the smell of the OVR olfactory add-on, or the touch of HaptX tactile gloves. XR is personal, and even though it is often touted as the best technology for doing things remotely, there is no replacement for an in-person trade show to try it out. Niantic, a main sponsor of AWE, pushed the in-person value further by announcing an SDK for AR applications that entice users to get outside into the world and interact with real people, as opposed to being immersed in a purely virtual environment populated by synthetic avatars. John Hanke took a few stabs at Facebook, which once again was absent from AWE, urging developers to choose what future they want to build!
The conference happened at a unique time, when it felt like the rest of the world was shining a light on the XR community. Eleven years after its first edition, AWE was at the center of the next technological revolution, and all the innovations that were often thought of as anecdotal are suddenly the core technology for the next internet. All of us at AWE – exhibitors, speakers, and audience – were part of this special moment; it felt good to be part of the XR community.
The M word
The M word. There was no avoiding it; it was on everybody’s lips. It came straight out of Neal Stephenson’s novel and infused the AWE community. It was the subject of many talks, including those at NVIDIA’s GTC, which was happening virtually at the same time as the in-person AWE. It was elevated to a world-changing technology when Facebook changed its name and refocused tens of thousands of employees. It surrounded and penetrated AWE. “Was this what we have been doing all along?” wondered the XR community.
My long-time friend Avi Bar-Zeev – we worked in the virtual-earth business together a while back – gave a very thoughtful and humorous talk at AWE about all the possible Verses: not only the M*Verse, but all the existing and futuristic *Verses and how to organize them. Here is one slide worth spending some time reading.
One thing that seemed to be agreed upon at AWE is that the metaverse is singular, not plural. It is like ‘the internet’. In fact, the metaverse is probably best described as the next version of the internet, or more exactly the next version of the web. In the metaverse, however, everything will be able to interact with the user and can be transported from place to place. Instead of videos and HTML pages, the content will be three-dimensional, with physically based rendering and mechanical properties. Like the internet it will be sharable, but it will go further: the metaverse will provide shared experiences. There is a large consensus, at least as voiced by the AWE community, that the metaverse should be open and standard.
However, unlike the internet, there is no official public governance, and no guarantee that all the multiverses the various companies are promising will converge into a metaverse. How does one take an asset from one company’s *verse to another’s? I guess we all have to be very aware of this. Magically, another piece of technology is coming to the rescue. It comes from the world of finance and security, but seems to be maturing at just the right time. It is a very fortunate convergence: the world of cryptocurrency and NFTs is on a collision course with the XR community, and this may very well end up creating new technologies, not unlike a particle accelerator where collisions reveal new particles. A fun fact: the inventor of this electronic currency is known as Satoshi Nakamoto, and is in fact a virtual person, as we are not sure who he is. Predestined to be part of the metaverse.
The metaverse does not imply that one must use XR technology; it could be observed and interacted with using a phone, a web browser, or any other device that can provide a view into it. But another big expectation about the metaverse is that it will allow content creation. It is not a place where passive observers query something and observe the result; it is a place where everybody can contribute, create content, share content, and, maybe more importantly, collaborate to create content. So XR is surely the best technology to interface with the metaverse, and human-eye resolution, hand tracking, and eye tracking will be essential to provide the best access to it. Just as it is possible to watch a movie highly compressed on a small, low-resolution phone screen – but certainly not by choice.
Web browsers and cloud computing have been around for a while now, but the metaverse calls for a complete revamp of the technology. Game engines are now a core piece of technology, as they are needed to render and animate all the metaverse’s objects. GPUs and CPUs are under pressure to provide more performance than ever. Network bandwidth – and, even more importantly, latency – needs to improve. AI has to be everywhere, to provide life within the metaverse and help users navigate it. There is no shortage of technology that needs to be improved or invented under the umbrella of building the metaverse.
To end this article, I want to go back to reality, as mentioned in the beginning. Even if there are cases where isolation from the real world and the people in it is a perfectly valid use case, in general real life and real people have to be part of the metaverse and interface with it in a natural way. I am not convinced that everyone wants to be represented by an avatar, or that the world around us benefits much from being replaced by a simplified version. Rather, the real world has to be faithfully represented and interacted with, just like virtual objects in the metaverse. We humans perceive the real world through a set of sensors, and IMHO the metaverse has a chance to give us superpowers for observing the real world. Using a Varjo high-end pass-through XR-3 HMD, which focuses on quality and very low latency, I can already apply real-time shaders to the camera image to enhance my vision – or do the opposite, to enable training in different lighting conditions. I can add shadows from virtual objects onto the real world to blend them in, or add reflections of the real world onto the surfaces of virtual objects. This is today, so let’s together imagine and improve over time how to enable the best metaverse, where reality and virtual reality are indistinguishable.
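As a toy illustration of what "enhancing vision" with a per-pixel shader means – plain Python, not Varjo's actual SDK, with a made-up `enhance` function and gamma value chosen just for the example – brightening a dark camera frame could be sketched like this:

```python
# Toy sketch of a per-pixel "shader" pass over a grayscale camera frame
# (values 0..255). Real pass-through enhancement runs on the GPU; this
# only illustrates the idea of applying the same function to every pixel.

def enhance(frame, gamma=0.5):
    """Gamma-correct a grayscale frame to lift dark regions."""
    # Precompute a lookup table, much as a shader bakes its constants.
    lut = [round(255 * (v / 255) ** gamma) for v in range(256)]
    return [[lut[v] for v in row] for row in frame]

dark_frame = [[10, 40], [90, 255]]
bright = enhance(dark_frame)  # dark pixels are lifted; black and white are unchanged
```

With gamma below 1 the curve boosts shadows while leaving pure black and pure white fixed – the same kind of transform that, run in real time on the pass-through image, gives the wearer "night vision", or, with gamma above 1, simulates darker lighting conditions for training.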
AWE is inspiring!