Meta’s VR Avatars to Incorporate Tongue Movement Recognition
Meta, the parent company of Facebook, is pushing the boundaries of virtual reality (VR) interaction with its latest development in the VR avatar space. In version 60 of its software development kits (SDKs) for Unity and native code, Meta has introduced the ability to track tongue movement when using a VR headset. This advancement, part of Meta’s face-tracking OpenXR extension, aims to add a new layer of realism to VR avatars by simulating tongue movement during interactions.
While Meta’s own Avatars SDK reportedly has not yet integrated this feature, third-party avatar solutions can leverage the updated SDK version to incorporate tongue movement tracking. The move reflects Meta’s commitment to creating more immersive and lifelike virtual experiences, emphasizing nuanced facial expressions as a crucial element of realistic interactions within the metaverse.
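To make this concrete, face-tracking APIs of this kind typically expose each expression, including tongue movement, as a per-frame weight between 0.0 and 1.0, which an avatar system then maps onto its own blend shapes. The sketch below illustrates one way a third-party avatar pipeline might do that mapping; all names (`tongue_out`, `jaw_drop`, the deadzone and gain parameters) are hypothetical and not taken from Meta’s SDK.

```python
# Illustrative sketch only: mapping a tracked expression weight (0.0-1.0),
# as a face-tracking API might expose it, onto an avatar blend-shape value.
# Expression names and parameters are hypothetical, not Meta SDK identifiers.

def map_expression_to_blendshape(weight: float,
                                 deadzone: float = 0.05,
                                 gain: float = 1.0) -> float:
    """Clamp and rescale a raw tracking weight into a blend-shape value.

    A small deadzone suppresses sensor noise near zero, and the result is
    clamped back into the [0, 1] range most avatar rigs expect.
    """
    if weight < deadzone:
        return 0.0
    rescaled = (weight - deadzone) / (1.0 - deadzone) * gain
    return max(0.0, min(1.0, rescaled))

# One frame of tracked expression weights (illustrative values).
tracked_frame = {"tongue_out": 0.6, "jaw_drop": 0.3, "tongue_out_noise": 0.02}

# Near-zero noise is zeroed out; real movement passes through rescaled.
avatar_blendshapes = {name: map_expression_to_blendshape(w)
                      for name, w in tracked_frame.items()}
```

In practice a deadzone like this keeps an avatar’s tongue from visibly jittering when the sensors report tiny spurious weights, while full tracked values still drive the blend shape all the way to its maximum.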
This development may seem unconventional at first, but it aligns with Meta’s broader efforts to enhance VR interactions, including the ongoing development of hyper-realistic avatars. The inclusion of tongue movement tracking is seen as a means to improve the authenticity of facial expressions, especially during speech.
As Meta reportedly explores innovative technologies like brain-computer interfaces, the introduction of tongue tracking underscores the company’s dedication to pushing the boundaries of what is possible in the virtual realm. While some users may find this development surprising, it signifies Meta’s ongoing commitment to advancing VR technologies and creating more realistic and engaging experiences for users navigating the metaverse.
For more updates, stay with Markedium.