First unveiled at last year’s Meta Connect conference, the feature lets content creators translate their videos into other languages while preserving their original voice and tone, producing translations that sound more natural and realistic.
The technology also includes lip-syncing, so mouth movements match the translated audio and videos look more convincing and polished. According to TechCrunch, this represents a significant advancement in digital content creation.
In its initial phase, the feature supports translation only between English and Spanish, with additional languages to be added later. It is currently available to content creators with over 1,000 followers on Facebook, as well as all public accounts on Instagram in regions where Meta AI is available.
Creators can activate the new option by selecting “Translate Your Voice with Meta AI” before publishing a video. Viewers will see a notification indicating that the video has been translated using Meta’s AI technology.
Alongside translation, Meta has introduced a new tool in the Insights dashboard that lets creators track views by language, helping them measure how translations affect audience growth. Creators can also upload up to 20 dubbed audio tracks per video, expanding their reach across multiple markets.
Although Meta has not yet revealed which languages will be added next, the company confirmed that this move is part of its broader plan to enhance AI tools, alongside restructuring teams to focus on research, advanced AI, and cutting-edge products.
Instagram head Adam Mosseri said:
"We believe that creators have potential audiences who don’t speak the same language. By helping them overcome language and cultural barriers, we enable them to grow their follower base and get more value from our platforms."