Microsoft is adding a powerful new feature to its Teams platform: voice cloning.
Starting in early 2025, users will be able to use Interpreter in Teams to replicate their voices in up to nine languages, including English, French, German, and Mandarin Chinese.
This feature promises real-time, speech-to-speech translation that mimics your speaking voice for a personal touch in multilingual meetings.
Microsoft CMO Jared Spataro described the tool as enabling users to “sound just like you in a different language.” According to Spataro, it’s designed to make virtual interactions more engaging and personal. However, the tool raises significant security concerns.
How Voice Cloning Will Work in Teams
The feature will be available to Microsoft 365 subscribers and is designed with privacy in mind. Microsoft says Interpreter does not store biometric data and does not add sentiment beyond what is naturally present in a voice.
The system replicates a speaker's message "as faithfully as possible," without adding assumptions or extraneous information.
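Microsoft hasn't published Interpreter's internals, but real-time speech-to-speech translation systems generally chain three stages: speech recognition, machine translation, and speech synthesis in the speaker's cloned voice. The sketch below illustrates that generic pipeline; every name in it is a hypothetical stub for illustration, not the Teams API.

```python
from dataclasses import dataclass

@dataclass
class VoiceProfile:
    """Stands in for whatever consented voice data the service would hold."""
    speaker_id: str

def transcribe(audio: bytes, lang: str) -> str:
    """Speech-to-text stage (stubbed)."""
    return "<transcript>"

def translate(text: str, src: str, dst: str) -> str:
    """Text-to-text machine translation stage (stubbed)."""
    return f"<{dst} rendering of {text}>"

def synthesize_in_voice(text: str, voice: VoiceProfile, lang: str) -> bytes:
    """Text-to-speech stage using the speaker's cloned voice (stubbed)."""
    return f"[voice:{voice.speaker_id}] {text}".encode()

def interpret_utterance(audio: bytes, src: str, dst: str, voice: VoiceProfile) -> bytes:
    """Run one utterance through ASR -> translation -> voice-cloned TTS,
    passing the message through without adding emotion or commentary."""
    text = transcribe(audio, lang=src)
    translated = translate(text, src=src, dst=dst)
    return synthesize_in_voice(translated, voice=voice, lang=dst)
```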
Voice simulation requires the user's consent, granted either through a settings toggle or via a notification during the meeting.
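Microsoft hasn't detailed the consent flow, but its description implies a simple opt-in gate along these lines; the function and field names below are assumptions for illustration, not real Teams settings.

```python
def prompt_for_consent(user: dict) -> bool:
    """Stand-in for the in-meeting notification asking a speaker to opt in."""
    return user.get("accepted_meeting_prompt", False)

def voice_simulation_allowed(user: dict) -> bool:
    """Clone a voice only with explicit consent, granted either ahead of
    time in settings or via a prompt during the meeting."""
    if user.get("consent_in_settings", False):
        return True
    return prompt_for_consent(user)
```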
Microsoft isn’t alone in exploring voice cloning: Meta and ElevenLabs have developed comparable multilingual voice-generation technology.
But voice cloning isn’t without challenges. AI translations often miss nuance and stumble over cultural idioms, so human interpreters remain valuable.
The Cybersecurity Risks of Voice Cloning
Voice cloning introduces serious cybersecurity vulnerabilities. Cybercriminals have already exploited deepfake technology for fraud, including impersonation scams that cost victims over $1 billion last year.
A chilling example from earlier this year involved cybercriminals hosting a fake Teams meeting populated with deepfaked executives, convincing their target to transfer $25 million.
Even if Microsoft’s Interpreter limits how voice cloning is used, it’s not immune to abuse. A bad actor could, for instance, feed the tool misleading audio to impersonate someone and extract sensitive information in their target’s language.
Microsoft has yet to spell out the safeguards it plans to implement to address these concerns. But with deepfake scams on the rise, those measures will need to be robust.
Even OpenAI shelved its voice cloning tool, citing risks and ethical considerations.
Closing Thoughts
Microsoft’s Interpreter in Teams has the potential to transform multilingual communication. But this comes with a pressing need for enhanced security measures.
As the tool rolls out, you’ll want to consider both the convenience it offers and the risks it brings.
Voice cloning is a double-edged sword. While it can foster better global collaboration, it’s also a gateway for fraud and deception.
Microsoft must address these risks head-on if it hopes to avoid turning innovation into a cybersecurity disaster.