How to create hyper-realistic avatars in virtual reality and mixed reality?

September 21, 2022
by Basil Trunov, Unreal Engine Developer and Relations Engineer, Varjo

The concept of a digital human in cyberspace, also known as an avatar, controlled by an operator wearing a virtual reality headset, was envisioned a long time ago. It originated in science fiction books of the late 1960s and became especially popular at the dawn of personal computing in the 1980s and 1990s.

Thanks to the development of extended reality technologies, we are now approaching a new era of digital presence and immersive interactions in the metaverse. This has made the topic of human representation in cyberspace more important than ever before.

Use of virtual reality avatars is growing rapidly

Various companies, content creators, and evangelists in the extended reality industry are implementing early prototypes of avatars in their digital content with the goal of enabling better interactions between humans in the metaverse.

In a relatively short period of time, these metahuman avatars have entered fields such as virtual production, AAA games, and VR blogging, and more industries are expected to adopt them soon.

With the rise of platforms such as VRChat, real-time VR blogging on YouTube, and specialized multi-user design review applications, the demand for realistic avatars is greater than ever before. A great example of VR blogging with real-time metahuman avatars is the Xanadu YouTube show.

But what kind of characteristics and capabilities enable you to create a great, realistic-looking avatar?

Characteristics of a great virtual reality avatar

A great avatar is as realistic-looking as possible. The greatest benefit of realistic avatars is that they enable far deeper, more immersive VR experiences than the (currently) more widely used, cartoonish avatars.

Being able to portray small details, such as the kinematics of emotions (how a person’s face moves when expressing a specific emotion) or imperfections that mimic a real person’s facial anatomy, creates a feeling of another person’s presence that is very close to real life. This is something that simplified, cartoonish characters can’t accomplish, even when used in a 3D environment.

Realistic-looking avatars provide a more personal, intimate, and hyper-realistic experience for virtual conversations and other interactions in the metaverse, whether you are using social applications such as VRChat or conducting professional design reviews with a team from around the world.

It is important to remember that the journey of the virtual reality avatar industry is just beginning. It goes without saying that there are still a lot of technical limitations that will need to be resolved for virtual avatars to feel completely realistic, but some of the early developments do look extremely promising.


Practical example: Epic Games MetaHuman Creator and Unreal Engine 5

The Epic Games MetaHuman avatars, which are created using Unreal Engine 5, are a perfect example of a proper implementation of the digital human concept. The visual quality they provide is quite unlike anything we have seen before. The power of this technology was showcased at the launch of Unreal Engine 5 with The Matrix Awakens experience.

Epic Games has also provided a very intuitive way of creating MetaHumans via its cloud-based MetaHuman Creator. Using MetaHuman Creator in the cloud is as easy and user-friendly as creating your character in a computer game. The creation tool runs smoothly in the cloud in real time and supports features such as ray tracing, which provides a high-fidelity visual representation of the created character. The character can then be exported into Unreal Engine 5 for further adjustments.

The MetaHuman Creator supports all the new animation features of Unreal Engine 5, such as Control Rig, space switching, the Pose Tool, facial poses, the IK Retargeter, and IK Rig, allowing you to use custom motion capture animations and real-time face capture via the Live Link plugin.

You can find more information about these features in the Unreal Engine 5 documentation linked below.
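
To give a concrete idea of what the Live Link side of such a pipeline can look like in code, here is a minimal, hedged C++ sketch of reading curve values from a Live Link subject. It is not Varjo’s or Epic’s production code: the subject name "FaceCapture" is a placeholder, and in a real project the captured curves would normally be applied to the MetaHuman face through an Animation Blueprint’s Live Link Pose node rather than logged.

```cpp
// Hedged sketch (not production code): reading curve values from a Live Link
// subject in C++. Assumes the Live Link plugin is enabled and that a subject
// named "FaceCapture" (placeholder name) is being streamed, e.g. from a
// face-capture source.

#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Features/IModularFeatures.h"
#include "Roles/LiveLinkBasicRole.h"

void LogLiveLinkFaceCurves()
{
	IModularFeatures& Features = IModularFeatures::Get();
	if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
	{
		return; // Live Link plugin not loaded
	}

	ILiveLinkClient& Client =
		Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

	// Evaluate the latest frame streamed for our (placeholder) subject.
	FLiveLinkSubjectFrameData Frame;
	const FLiveLinkSubjectName Subject(TEXT("FaceCapture"));
	if (!Client.EvaluateFrame_AnyThread(Subject, ULiveLinkBasicRole::StaticClass(), Frame))
	{
		return; // no data received for this subject yet
	}

	const FLiveLinkBaseStaticData* StaticData = Frame.StaticData.Cast<FLiveLinkBaseStaticData>();
	const FLiveLinkBaseFrameData* FrameData = Frame.FrameData.Cast<FLiveLinkBaseFrameData>();
	if (!StaticData || !FrameData)
	{
		return;
	}

	// Curve names (e.g. blendshape names from a face-capture source) live in the
	// static data; the per-frame values live in the frame data.
	const int32 NumCurves = FMath::Min(StaticData->PropertyNames.Num(), FrameData->PropertyValues.Num());
	for (int32 Index = 0; Index < NumCurves; ++Index)
	{
		UE_LOG(LogTemp, Log, TEXT("%s = %f"),
			*StaticData->PropertyNames[Index].ToString(), FrameData->PropertyValues[Index]);
	}
}
```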

How Varjo enables deeper interactions in the metaverse with photorealistic avatars

Here at Varjo, we have developed our own VR/XR demo using the latest features of MetaHuman Creator. Thanks to Unreal Engine 5, it is possible to create extremely lifelike MetaHuman characters easily.

By using Varjo’s top-tier headset, which features high-fidelity, low-latency color video pass-through for mixed reality, we are able to take the sensation of a virtual character’s presence in an actual room to the next level. With Varjo, it is now possible to see all the small details of a virtual character, such as cloth fabric, eyes, skin, and hair, and even experience facial animations captured in real time via Unreal Engine 5’s Live Link plugin. The virtual avatars can even be streamed directly from Varjo Reality Cloud.
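
As a rough illustration (not Varjo’s actual demo code) of how an application can request this kind of pass-through compositing, the sketch below uses the cross-vendor OpenXR API to enumerate the environment blend modes the runtime offers and prefer alpha blend, which composites rendered content over the headset’s camera view. This is an assumption on my part about one possible integration path; Varjo headsets can also be driven through Varjo’s own native SDK and Unreal plugin.

```cpp
// Minimal sketch: choosing an OpenXR environment blend mode so rendered
// content is composited over the headset's video pass-through view.
// Assumes an XrInstance and XrSystemId have already been created and that
// the runtime exposes an alpha-blend mode.

#include <openxr/openxr.h>
#include <vector>

XrEnvironmentBlendMode ChoosePassthroughBlendMode(XrInstance instance, XrSystemId system)
{
    // First call queries how many blend modes the runtime supports.
    uint32_t count = 0;
    xrEnumerateEnvironmentBlendModes(instance, system,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, 0, &count, nullptr);

    // Second call fills the list.
    std::vector<XrEnvironmentBlendMode> modes(count);
    xrEnumerateEnvironmentBlendModes(instance, system,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, count, &count, modes.data());

    // Prefer alpha blend (virtual content over the real room); fall back to opaque VR.
    for (XrEnvironmentBlendMode mode : modes)
    {
        if (mode == XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND)
        {
            return mode;
        }
    }
    return XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
}

// The chosen mode is later passed to xrEndFrame via XrFrameEndInfo::environmentBlendMode.
```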

This is an ongoing, work-in-progress experiment to see where avatar technology may lead us in the future. We are still actively developing and adding new features to the experiment. In the future, capabilities such as full-body motion tracking may give us a unique opportunity to virtually embody the avatar and take full control over it, while everything is rendered in real time in mixed reality at human-eye resolution on the Varjo headset.

This marks the first step into the cyberspace of the metaverse in extremely high resolution using hyper-realistic MetaHuman avatars – exactly the way it was predicted in those cyberpunk sci-fi books of the past decades.

Creating digital twins of real-life people

Speaking of recent developments, it is worth mentioning that Epic Games has released a new, updated version of the MetaHuman plugin. The new version includes the Mesh to MetaHuman feature, which allows you to upload a 3D-scanned mesh of a real face into MetaHuman Creator. This enables us to create more accurate and realistic digital twins than ever before.

This very feature was used in a magnificent way by researchers to reconstruct the face of a 10,000-year-old shaman.

The future of MetaHumans and photorealistic avatars

As you can see from the examples in this post, metahuman avatar technology is developing rapidly. As people take their first steps in different metaverses, realistic and constantly improving avatars will be key elements in making these environments feel more real and full of meaningful interaction.

Even though the journey is just beginning, the future already looks bright for human interaction in the metaverse.
