
AI tools now used for scams: How to stay safe from cloned voice frauds

Story highlights

AI voice cloning scams are on the rise in India. Fraudsters are now mimicking the voices of relatives or bosses to trick people into transferring money. With voice samples easily available online, these scams are becoming harder to detect. Stay alert; verify unexpected calls.

Scammers in India are now using artificial intelligence to copy voices and cheat people. This method makes it easier for fraudsters to fool even careful, well-informed individuals. Here is how these scams work and what you need to know to stay safe.

What Is AI Voice Cloning?

AI voice cloning is a technology that can copy someone’s voice after listening to just a few seconds of a recording. Scammers use this tool to make fake phone calls that sound exactly like someone you know, such as a family member or a boss.

How Do the Scams Happen?

Scammers call people pretending to be someone they trust. They might say there is an emergency and ask for money. Sometimes, they use the cloned voice to ask for sensitive information like bank details or passwords.

In some cases, scammers use AI voice cloning to trick companies by pretending to be senior managers and asking employees to send money or share confidential data.

Cases in India

Police in several Indian cities have recently reported cases in which people lost money after receiving calls from voices that sounded exactly like their relatives, with the scammer posing as a family member and asking for money.

Why Are These Scams Increasing?

AI tools have become cheaper and easier to use. Scammers can find voice samples from social media or public videos. With just a short clip, they can create a fake voice and start making calls.

How Can You Protect Yourself?

Be careful if you get a call asking for money or personal information, even if the voice sounds familiar. Always verify by calling the person back on their usual number. Avoid posting voice recordings and personal details publicly on social media, as scammers use such content to collect voice samples.

What Authorities Are Doing

The government has issued warnings about these emerging AI voice scams. Cybercrime units are actively working to track down the fraudsters and have urged the public to remain vigilant. The Indian Cyber Crime Coordination Centre (I4C), under the Ministry of Home Affairs, has released alerts highlighting the dangers of AI-generated voice scams.

Victims of such fraud can report incidents through the National Cyber Crime Reporting Portal at https://cybercrime.gov.in/.