Interaction-Aware Avatar Animation and Rendering


In the ever-evolving landscape of Extended Reality (XR), the SHARESPACE project stands as a pioneering endeavor focused on creating immersive and interactive embodied experiences for users. “Interaction-Aware Avatar Animation and Rendering” is the project’s fourth work package, comprising a series of activities aimed at pushing the boundaries of avatar technology and rendering techniques.

At the heart of any embodied XR experience is the avatar, a digital representation of oneself in the virtual world. SHARESPACE’s first undertaking is to redefine what avatars can be. These avatars are not just visual representations; they are interactive, adaptive, and data-driven. By establishing a comprehensive definition of what an avatar is and does, SHARESPACE ensures that avatars are not mere spectators but active participants in the XR environment. Find more information about SHARESPACE avatars here.

Animating avatars traditionally involves painstaking keyframing or motion capture. SHARESPACE’s approach is data-driven: the motion of individuals is captured and used to animate their avatars. In addition, machine learning algorithms will analyze human movement to produce realistic animations. The result is avatars that move and interact with users in a more natural and dynamic manner, elevating immersion to new heights.
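To make the data-driven idea concrete, the sketch below (Python with PyTorch) shows a toy next-pose predictor: a small network trained on short windows of captured poses to predict the following frame. This is purely illustrative, not SHARESPACE’s actual model; the joint count, window size, architecture, and the synthetic training data are assumptions chosen for the example.

```python
# Minimal sketch of data-driven motion prediction (illustrative only; not the
# SHARESPACE pipeline). A small network learns to predict the next body pose
# from a short window of captured poses, so an avatar can be animated from
# recorded motion data. Joint count, window size, and architecture are assumed.
import torch
import torch.nn as nn

NUM_JOINTS = 24            # assumed skeleton size
POSE_DIM = NUM_JOINTS * 3  # 3 rotation parameters per joint
WINDOW = 10                # number of past frames fed to the model

class NextPosePredictor(nn.Module):
    """Predicts the pose at frame t from the poses at frames t-WINDOW..t-1."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(WINDOW * POSE_DIM, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, POSE_DIM),
        )

    def forward(self, pose_window: torch.Tensor) -> torch.Tensor:
        # pose_window: (batch, WINDOW, POSE_DIM), flattened for the MLP
        return self.net(pose_window.flatten(start_dim=1))

# Tiny training loop on synthetic data, standing in for captured motion.
model = NextPosePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    # In practice these windows would come from motion-capture recordings.
    past = torch.randn(32, WINDOW, POSE_DIM)
    target = torch.randn(32, POSE_DIM)
    loss = loss_fn(model(past), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```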

One of the challenges in XR is maintaining the distinguishing characteristics of how users move, regardless of their unique appearance. SHARESPACE tackles this challenge by employing advanced kinematic mapping techniques, ensuring that avatar motion is not only lifelike but also reflects each user’s personality and individual movement style.
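As a rough illustration of kinematic mapping, the following sketch copies local joint rotations (which carry the user’s movement style) from the captured skeleton onto the avatar and rescales the root translation for a different body height. The function name, skeleton representation, and heights are hypothetical; real retargeting handles much more, such as per-limb bone-length differences and foot contacts.

```python
# Illustrative sketch of a simple kinematic retargeting step (not SHARESPACE's
# actual mapping). Joint rotations transfer directly, preserving motion style,
# while root translation is rescaled for an avatar of a different height.
import numpy as np

def retarget_frame(source_rotations: dict,
                   source_root_pos: np.ndarray,
                   source_height: float,
                   target_height: float):
    """Map one frame of captured motion onto an avatar with different proportions."""
    # Local joint rotations (e.g. 3x3 matrices) carry the user's individual
    # movement style, so they are copied unchanged.
    target_rotations = {joint: rot.copy() for joint, rot in source_rotations.items()}

    # Root translation is scaled so stride length matches the avatar's size,
    # reducing foot sliding when the avatar is taller or shorter than the user.
    scale = target_height / source_height
    target_root_pos = source_root_pos * scale
    return target_rotations, target_root_pos

# Example: a captured frame from a 1.80 m user mapped onto a 1.60 m avatar.
frame_rotations = {"hips": np.eye(3), "left_knee": np.eye(3)}
avatar_rot, avatar_root = retarget_frame(frame_rotations,
                                         source_root_pos=np.array([0.0, 0.95, 0.3]),
                                         source_height=1.80,
                                         target_height=1.60)
```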

The XR environment is inherently social: people interact with avatars and other users, making real-time adaptability crucial. SHARESPACE’s avatars are designed to adapt seamlessly to different XR environments. Whether it is a casual conversation in a virtual café or cycling on a real-world road, these avatars respond appropriately, enhancing the social aspect of XR experiences.
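One simple way to picture such adaptability, purely as an assumption about how a system like this might be organized, is a dispatcher that selects an animation profile per detected context. The context names and parameters below are invented for illustration and do not describe SHARESPACE’s actual adaptation logic.

```python
# Hedged sketch of context-aware animation selection (illustrative only).
# The idea: detect the kind of shared activity and switch the avatar's
# animation set and update rate accordingly.
from dataclasses import dataclass
from enum import Enum, auto

class SceneContext(Enum):
    CONVERSATION = auto()   # e.g. a virtual café
    CYCLING = auto()        # e.g. riding on a real-world road

@dataclass
class AnimationProfile:
    animation_set: str          # which clips or generative model to use
    update_rate_hz: int         # how often the full body pose is refreshed
    upper_body_priority: bool   # favour gestures and gaze over locomotion

def select_profile(context: SceneContext) -> AnimationProfile:
    """Pick animation parameters appropriate to the current social context."""
    if context is SceneContext.CONVERSATION:
        return AnimationProfile("gesture_and_gaze", update_rate_hz=30,
                                upper_body_priority=True)
    if context is SceneContext.CYCLING:
        return AnimationProfile("cycling_locomotion", update_rate_hz=72,
                                upper_body_priority=False)
    raise ValueError(f"unhandled context: {context}")

profile = select_profile(SceneContext.CYCLING)
```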

To truly transport users into immersive XR worlds, SHARESPACE leverages state-of-the-art neural rendering techniques. These techniques take input images of the real world and synthesize views from new points of view, enabling the generation of photorealistic scenes and making the virtual environment more believable. As avatars interact within these scenes, the boundary between virtual and real blurs, resulting in an unparalleled sense of presence.
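As an illustration of what generating views from new viewpoints involves, the sketch below shows the volume-rendering (alpha-compositing) step used by NeRF-style neural renderers, with random values standing in for the density and colour that a trained network would predict along each camera ray. It is not the specific renderer used in SHARESPACE.

```python
# Minimal sketch of the volume-rendering step behind NeRF-style neural
# rendering (illustrative; not the specific renderer used in SHARESPACE).
# A learned network would normally supply density and colour at sampled points
# along each camera ray; here random values stand in for that network.
import numpy as np

def composite_ray(densities: np.ndarray, colors: np.ndarray, deltas: np.ndarray) -> np.ndarray:
    """Alpha-composite samples along one ray into a single pixel colour.

    densities: (N,) non-negative volume densities at the samples
    colors:    (N, 3) RGB colour at each sample
    deltas:    (N,) distance between consecutive samples
    """
    alphas = 1.0 - np.exp(-densities * deltas)                        # opacity per segment
    transmittance = np.cumprod(np.concatenate(([1.0], 1.0 - alphas[:-1])))  # light surviving to each sample
    weights = transmittance * alphas                                  # contribution per sample
    return (weights[:, None] * colors).sum(axis=0)                    # final pixel colour

samples = 64
densities = np.random.rand(samples) * 5.0
colors = np.random.rand(samples, 3)
deltas = np.full(samples, 1.0 / samples)
pixel = composite_ray(densities, colors, deltas)
```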

The “Interaction-Aware Avatar Animation and Rendering” work package within the SHARESPACE project represents a significant leap forward in the realm of XR. By defining avatars as interactive entities, employing data-driven animation, preserving individual users’ style, enabling social adaptation, and harnessing neural rendering, SHARESPACE is poised to revolutionize how we perceive and interact with others within real and virtual worlds.