The Rise of Voice Cloning Scams in India

On a frantic afternoon in Noida, Vrinda received a call from her son, or so it seemed. The voice on the other end begged for a quick transfer of Rs 60,000, claiming he was in danger and urgently needed the money. The urgency in the voice and the muffled commands in the background painted a vivid picture of a crisis. Something felt off when the caller said “mummy” instead of the usual “mom,” but her concern deepened when she heard what sounded like her child sobbing, making the situation alarmingly real. Fueled by fear, she transferred the money, only to discover later that it was a scam. Her son was never on the other end of the call; his voice had been convincingly cloned using sophisticated software.

Voice cloning scams, powered by Artificial Intelligence (AI), are on the rise in India, particularly in the Delhi-National Capital Region. Cybercriminals are exploiting the technology for extortion, driving a sharp increase in cybercrime cases. Data from the National Crime Records Bureau reveals that cybercrime cases in Delhi alone surged to 685 in 2022, up from 345 in 2021 and 166 in 2020. A survey by McAfee found that 66% of Indian respondents would be likely to respond to a voice message or phone call seeking urgent financial help, especially if the caller sounded like a close relative. The most convincing pretexts used by scammers included being robbed, being involved in a car crash, losing a phone or wallet, or needing funds while travelling overseas. The survey also found that 86% of Indians share their voice data online or through voice messages at least once a week, making them especially vulnerable to these scams.

These voice cloning scams not only cause financial damage but also inflict psychological harm. A report by the Future Crime Research Foundation states that online financial fraud accounted for a staggering 77.41% of all cybercrime cases reported from January 2020 to June 2023. Additionally, nearly 50% of the reported cybercrime cases were linked to transactions through the Unified Payments Interface (UPI) and internet banking. This highlights the vulnerability of digital transaction methods to fraudulent activities.

Prateek Waghre, Executive Director at the Internet Freedom Foundation, explains the deceptive strategies employed by scammers, stating, “Although these cloning tools have their limitations, scammers compensate by instilling a sense of urgency to overshadow these imperfections. To a large extent, the entire cybercrime scene hasn’t been well mapped out in India. Voice cloning scams can target individuals in new ways, not just as an unknown third party pretending to be a government agent. For example, now, individuals might receive calls from voices resembling those of their parents, bosses, children, or friends, asking for money or information. This complexity makes detection particularly challenging.”

Voice cloning technology has advanced to the point where only a few seconds of someone’s voice are needed to accurately replicate it. According to McAfee, even those with minimal experience can use this technology to create a voice clone that matches the original voice with about 85% accuracy. Romit Barua, a Machine Learning Engineer and Researcher from UC Berkeley, explains that voice cloning leverages advancements in audio signal processing and neural networks to replicate a person’s voice. There are two relevant forms of voice cloning: Text-To-Speech (TTS) and Voice Conversion. TTS converts written text into spoken words using synthetic voices, while Voice Conversion changes the characteristics of a voice in existing audio to sound like another person while preserving the original speech content.
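Barua's description of TTS-based cloning can be made concrete. The sketch below uses the open-source Coqui TTS library and its XTTS v2 model, which supports zero-shot voice cloning from a short reference clip; the model name, file paths, and spoken text are illustrative placeholders, and the exact API may differ between library versions.

    # A minimal sketch of TTS-based voice cloning, assuming the open-source
    # Coqui TTS library (pip install TTS). File names and text are placeholders.
    from TTS.api import TTS

    # Load a multilingual model that supports zero-shot voice cloning.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # A few seconds of reference audio is enough for the model to read
    # arbitrary text in an approximation of the reference speaker's voice.
    tts.tts_to_file(
        text="Hello, this is a demonstration of voice cloning.",
        speaker_wav="reference.wav",  # short clip of the target voice
        language="en",
        file_path="cloned_output.wav",
    )

That a workable clone takes only a handful of lines plus a short audio sample is precisely why the commercial platforms described below can package the same capability behind a simple subscription.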

Once scammers obtain an audio clip of an individual’s voice, they can use online services capable of mimicking that voice with high accuracy. Platforms such as Murf, Resemble, and Speechify offer subscriptions ranging from $15 for basic access to $100 for advanced features. These platforms let scammers create convincing voice clones, even if subtler inflections of the original voice are lost.

Scammers employ various tactics with voice cloning technology. One prevalent use is the Family Member in Crisis Scam, where scammers trick people into thinking that a loved one is in immediate danger or distress and urgently needs financial assistance. By mimicking the voice of someone close to the victim, scammers create believable scenarios of emergencies like accidents or legal issues. The emotional turmoil and supposed urgency impair the victim’s clarity of thought, leading to hasty actions without verifying the authenticity of the situation. Voice cloning technology has also given a new edge to kidnapping scams, as criminals can now mimic the voice of a supposed hostage, often a family member, to extort money or obtain sensitive information.

Prateek Waghre stresses that a significant concern is how scammers obtain the detailed personal information needed to make these clones convincing. In some cases the cloned voice is indistinct yet still believable; in others it sounds exactly like the victim’s relative or friend, which raises the question of where scammers source such detailed personal data. Waghre also points to a broader issue in the digital landscape: cybersecurity attacks that exploit human psychology, known as social engineering. These attacks manipulate individuals into voluntarily surrendering confidential data by leveraging emotional triggers such as fear, urgency, or empathy.

To protect against voice cloning scams, experts recommend several strategies. First and foremost, verification is crucial. When faced with unexpected requests involving urgent financial transactions or sensitive information, it’s essential to verify the authenticity of the communication through another channel. Another recommendation is to stay calm and composed when receiving a call claiming a loved one is in crisis. Taking a moment to gather pertinent details, ask probing questions, and request alternate methods of verifying the caller’s identity can help prevent falling for a scam. Activating the caller ID feature on smartphones is also important, as it provides notifications about incoming calls, including the caller’s identity and location. Lastly, establishing code words within families can be an additional security measure to verify the identity of callers.

As the prevalence of voice cloning scams continues to rise in India, it is crucial for individuals to stay informed about these manipulative strategies. By staying vigilant and taking necessary precautions, people can better protect themselves from inadvertently falling victim to cyber fraud.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.