Codec Avatars: Immersive Telepresence with Lifelike Avatars
Codec Avatars is a technology for metric telepresence: immersive social presence that aims to be indistinguishable from reality. It lets people in different locations interact through eye contact, subtle shifts in expression, posture, and gesture, enabling a fully embodied, natural form of remote communication.
What are Codec Avatars?
Codec Avatars let people interact much as they would face to face, regardless of physical distance. By capturing nuances such as facial expressions, body language, and gestures, they provide a lifelike representation of each user in a virtual environment.
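At a high level, the "codec" in Codec Avatars pairs an encoder, which compresses a person's current expression into a compact latent code, with a decoder that reconstructs the avatar's geometry and appearance from that code, so only the small code needs to cross the network. The sketch below illustrates this encoder-decoder structure in PyTorch; the layer sizes, input shapes, and names (`ExpressionEncoder`, `AvatarDecoder`, `LATENT_DIM`, the vertex count) are illustrative assumptions, not Meta's implementation.

```python
# Minimal encoder-decoder sketch of the "codec" idea. All shapes and
# layer choices here are illustrative assumptions, not Meta's models.
import torch
import torch.nn as nn

LATENT_DIM = 256  # assumed size of the transmitted expression code


class ExpressionEncoder(nn.Module):
    """Compresses a captured face image into a small latent code."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 256x256 -> 128x128
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 128x128 -> 64x64
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, LATENT_DIM),
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.net(image)


class AvatarDecoder(nn.Module):
    """Reconstructs avatar geometry (mesh vertices) from the latent code."""

    def __init__(self, num_vertices: int = 7306):  # assumed mesh resolution
        super().__init__()
        self.num_vertices = num_vertices
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 1024),
            nn.ReLU(),
            nn.Linear(1024, num_vertices * 3),
        )

    def forward(self, code: torch.Tensor) -> torch.Tensor:
        return self.net(code).view(-1, self.num_vertices, 3)


# Sender encodes; only the small latent code is transmitted; receiver decodes.
image = torch.rand(1, 3, 256, 256)  # stand-in for a headset camera frame
code = ExpressionEncoder()(image)
vertices = AvatarDecoder()(code)
print(code.shape, vertices.shape)  # (1, 256) and (1, 7306, 3)
```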
Why open source?
The Codec Avatars lab at Meta Reality Labs Research has been developing lifelike avatars for immersive telepresence since 2015. Releasing parts of this work as open source lets the lab collaborate with the wider research community to advance the field of metric telepresence.
To that end, Meta Reality Labs Research has shared datasets and baseline implementations for Codec Avatars. Researchers can use these resources to explore a range of challenges in metric telepresence (a minimal data-loading sketch follows the list), including:
- End-to-end telepresence with the Ava-256 dataset
- Comprehensive captures of full bodies, hands, and faces in the Goliath-4 dataset
- High-quality recordings of facial expressions and 3D hair modeling
- 3D interacting hand pose estimation and relightable 3D interacting hands
- Neural reconstruction of high-quality indoor scenes
- Modeling the 3D spatial sound of humans from body pose and audio
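In practice, releases like these boil down to frame-indexed capture data that can be wrapped in a standard PyTorch `Dataset`. The sketch below shows one way to do that; the directory layout, file names (`frame_*`, `cam*.png`, `pose.npy`), and tensor shapes are assumptions for illustration, not the format of any specific release.

```python
# Hypothetical loader for frame-indexed multi-view capture data.
# The on-disk layout below is an assumption, not an actual release format.
from pathlib import Path

import numpy as np
import torch
from PIL import Image
from torch.utils.data import DataLoader, Dataset


class AvatarCaptureDataset(Dataset):
    """One multi-view capture frame per item.

    Assumed (hypothetical) layout:
        root/
          frame_000000/
            cam00.png ... camNN.png   # synchronized camera views
            pose.npy                  # per-frame metadata, e.g. head pose
    """

    def __init__(self, root: str):
        self.frames = sorted(Path(root).glob("frame_*"))

    def __len__(self) -> int:
        return len(self.frames)

    def __getitem__(self, idx: int) -> dict:
        frame_dir = self.frames[idx]
        # Stack all camera views into a single (V, 3, H, W) float tensor.
        images = [
            np.asarray(Image.open(p).convert("RGB"))
            for p in sorted(frame_dir.glob("cam*.png"))
        ]
        views = torch.from_numpy(np.stack(images)).permute(0, 3, 1, 2).float() / 255.0
        pose = torch.from_numpy(np.load(frame_dir / "pose.npy")).float()
        return {"views": views, "pose": pose}


# Typical use: batch frames for training a telepresence model.
dataset = AvatarCaptureDataset("/path/to/capture")  # hypothetical path
loader = DataLoader(dataset, batch_size=4, shuffle=True, num_workers=2)
```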
By sharing these resources, Meta Reality Labs Research aims to help the research community push the boundaries of metric telepresence through collaboration and knowledge-sharing.
For more information, see the Codec Avatars page at Meta Reality Labs Research.