The Double-Edged Sword of Voice Cloning


AI-powered voice cloning is revolutionizing customer experience across industries by enabling businesses to create synthetic voices that are nearly indistinguishable from human speech.

This innovation opens up new possibilities for personalization and more efficient interactions that could dramatically improve the customer experience. For example, intelligent voice assistants built on voice cloning can help curb call volume in contact centers by answering FAQs around the clock, all while sounding exactly like a brand’s best human agent. As businesses seek to enhance their services and streamline operations, AI voice cloning presents exciting opportunities, but it also raises serious concerns.

From high-profile scandals like Scarlett Johansson accusing OpenAI of stealing her voice to alarming studies revealing how easy it is to clone political figures’ voices, voice cloning’s rapid advancement opens the door to myriad malicious misuses. However, businesses that pursue safe and transparent uses of this technology, and individuals who educate themselves about it, can help curtail potential abuse.

The evolution of voice

The process of creating a synthesized voice has evolved over time, with advancements in tech allowing for more efficient and cost-effective production. Previously, it was necessary to manually record every individual prompt with voice actors. Now, it’s possible to clone a voice with just three seconds of audio.

This is great for enterprises looking to leverage voice assistants in their contact centers as a way to offer better service to customers when resources are limited. But it also opens the door for bad actors to misuse the technology for nefarious purposes, like scamming people out of money because the voice on the other end of the line sounds like a loved one in distress. In these cases, the perpetrator can pull audio snippets from social media or voicemail messages, then use AI to clone the voice and manipulate victims. In fact, 77% of people who have been the victim of an AI voice scam lost money to this high-tech deception.

Safeguarding against voice cloning misuse

To avoid falling victim to criminals misusing cloned voices, there are several best practices to keep in mind. If you get a call claiming to be from a loved one in trouble, the FTC recommends calling the person back to verify the situation. It also advises that “if the caller says to wire money, send cryptocurrency or buy gift cards and give them the card numbers and PINs, those could be signs of a scam.” You can also proactively establish a code word or “security question” with family members to verify it’s really them in case of an emergency. In addition, there are some common clues that can help you determine whether the voice on the other end of the line belongs to a person or is the result of voice cloning:

  • Indirect answers to questions, like answering a yes or no question with a narrative response
  • Interruptions without apologies
  • No empathetic tonal changes
  • Changes in the voice, e.g. sounding like multiple different people are talking
  • Repeating phrases exactly

By staying vigilant, verifying unexpected requests through alternative channels and educating yourself about the latest voice cloning technologies, you can significantly reduce your risk of falling victim to these increasingly sophisticated scams.
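For teams that review call recordings or transcripts, a few of these cues can even be approximated programmatically. The Python sketch below is purely illustrative, assuming you already have the caller’s transcribed turns; the phrase lists, thresholds and function names are hypothetical and nowhere near a production detector.

```python
import re
from collections import Counter

# Hypothetical transcript of the caller's turns, e.g. pulled from a call-recording tool.
caller_turns = [
    "Please wire the money right away.",
    "Please wire the money right away.",
    "You need to act now.",
]

def repeated_phrases(turns, min_repeats=2):
    """Flag turns repeated verbatim -- one of the cues listed above."""
    counts = Counter(t.strip().lower() for t in turns)
    return [phrase for phrase, n in counts.items() if n >= min_repeats]

def lacks_empathy_markers(turns):
    """Rough check: no common empathetic phrases appear anywhere in the call."""
    markers = ("sorry", "i understand", "i know this is hard", "apolog")
    text = " ".join(turns).lower()
    return not any(m in text for m in markers)

def indirect_yes_no_answer(question, answer):
    """Flag a narrative reply to a yes/no question (another cue from the list)."""
    is_yes_no = bool(re.match(r"^(is|are|do|does|did|can|could|will|would)\b",
                              question.strip().lower()))
    answers_directly = bool(re.match(r"^(yes|no|yeah|nope)\b", answer.strip().lower()))
    return is_yes_no and not answers_directly

if __name__ == "__main__":
    print("Repeated phrases:", repeated_phrases(caller_turns))
    print("No empathy markers:", lacks_empathy_markers(caller_turns))
    print("Indirect answer:", indirect_yes_no_answer(
        "Is this really you?", "Listen, I just need the money fast."))
```

None of these checks is conclusive on its own; they are simply a machine-readable restatement of the human cues above, best treated as prompts for a closer look rather than proof of cloning.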

When it comes to voice cloning considerations for businesses, transparency is key. If someone calls a customer service line and thinks they might be talking to a bot, the quickest way to find out is to just ask. Though voice assistants may not immediately identify themselves as such, ethically designed voice AI will confirm that it is a bot when asked directly. This transparency facilitates the ethical, effective use of this innovative technology in customer service settings.
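As a rough illustration of that “just ask” behavior, the sketch below shows how a voice assistant could recognize the question and disclose its nature before continuing the conversation. The patterns, wording and function names here are hypothetical examples, not PolyAI’s actual implementation, which would typically rely on a trained intent model rather than keyword matching.

```python
import re

# Hypothetical phrasings of the "are you a bot?" question.
BOT_QUESTION_PATTERNS = [
    r"\bare you (a )?(bot|robot|computer|ai|machine)\b",
    r"\bam i (talking|speaking) to a (bot|robot|human|person|machine)\b",
    r"\bis this a real person\b",
]

DISCLOSURE = (
    "I'm a virtual assistant for this contact center. "
    "I can help with many requests, or connect you with a human agent."
)

def respond(utterance: str, normal_handler) -> str:
    """Answer honestly if the caller asks whether they are talking to a bot;
    otherwise hand the utterance to the assistant's usual dialogue logic."""
    text = utterance.lower()
    if any(re.search(pattern, text) for pattern in BOT_QUESTION_PATTERNS):
        return DISCLOSURE
    return normal_handler(utterance)

if __name__ == "__main__":
    fallback = lambda u: "Sure, let me look into that."
    print(respond("Wait, am I talking to a bot?", fallback))   # discloses
    print(respond("I'd like to check my booking.", fallback))  # normal flow
```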

The rise of voice cloning technology presents both significant opportunities and serious challenges. By fostering a culture of responsibility and awareness, we can harness the benefits of voice cloning while mitigating its risks, ensuring a safer and more trustworthy digital landscape for all.

About the Author

Nathan Liu is VP of Deployments at PolyAI, a leader in customer-led voice assistants for high-volume contact centers, where he is responsible for implementation and delivery teams. Nathan has deployed over 500 bespoke enterprise voice assistants to PolyAI customers.
