Meta has launched an AI-powered voice translation tool that allows creators to dub and lip-sync their Reels in new languages, expanding their global reach.
Rolled out today, the feature not only translates words but also mimics the creator’s voice and synchronises lip movements, giving the impression that the creator is speaking the translated language.
“Meta AI translations let you speak to viewers in their own language, opening your content to new audiences,” the company said.
For now, translations support English-to-Spanish and Spanish-to-English. Eligible users include Facebook creators with at least 1,000 followers and all public Instagram accounts.
On Facebook, creators can activate the feature before publishing, preview translations, and opt in or out of lip-syncing. Once published, translated Reels automatically play in the viewer’s preferred language, and creators can track how the translations perform through their insights.
Meta added that creators can also upload up to 20 dubbed audio tracks to reach audiences beyond English- and Spanish-speaking markets. Instagram chief Adam Mosseri said the goal is to “help creators grow their following and cross cultural and linguistic barriers.”
The development follows the July 2024 extension of Facebook content monetisation to Nigerian creators, with In-Stream Ads and Ads on Reels enabling them to earn revenue from original video content.