A new form of AI scam: cybercriminals are cloning your voice to commit fraud. Here is how to spot a deepfake call


Deepfake call: Last month, a 43-year-old marketing professional in Bengaluru received a panicked call from his "daughter". She said she was in hospital and needed ₹50,000 immediately. The voice was perfectly real: the same tone, the same manner, the same way of saying "Appa". He transferred the money without thinking. In reality, his daughter was attending class at college.

This was not the plot of a film. It was an AI-generated deepfake call, a technique that can copy anyone's voice almost exactly.

Today, such voice scams are spreading rapidly in India's big cities like Bengaluru, Mumbai and Delhi. They target not only the elderly but also educated professionals, students and even startup CEOs.

How does an AI voice scam work?

Scammers no longer need to hack your phone or steal your SIM. All they need is a 30-second clip of your voice, which they can easily pick up from an Instagram Reel, a YouTube video or a WhatsApp forward.

With that clip, AI tools such as ElevenLabs, Descript or open-source voice-cloning software can reproduce your voice in any language. The cloned voice is then fed into a script and turned into fake stories: a medical emergency, a police threat, a bank loan or a kidnapping. The caller ID can also be spoofed, so the person on the other end believes the call is coming from someone close.

What the numbers show

According to a 2025 report by the Indian Cybercrime Coordination Centre (I4C), more than 2,800 deepfake call scam cases were reported between January and May. In metro cities, their number has risen by 200%.

Most cases fell into these categories:

  • A fake call from a "family member" in distress
  • Threats in the name of a bank or the police
  • Requests for data from someone posing as an employer

Bengaluru topped the list of such cases, followed by Mumbai, Hyderabad and Delhi NCR.

Who is being targeted?

It is wrong to assume that only the elderly are victims. These days professionals, students, YouTubers and even startup owners are falling prey, because their voices are already on the internet in LinkedIn interviews, Instagram Reels, podcast clips and so on.

A startup CEO from Hyderabad was about to pay against the voice note of a "vendor", but the scam was caught at the last minute on a video call.

Why is it more dangerous for India?

India's linguistic diversity and strong family bonds make such scams more dangerous here. AI can imitate tone and accent not only in English but also in languages like Hindi, Tamil, Marathi and Bengali, and people in India often treat a familiar voice as proof of identity. If someone speaks in a voice that sounds like a "son", a "boss" or a "bank manager", most people comply without thinking.

How to identify a deepfake call?

  • There is no background noise; the voice sounds unnaturally clean.
  • Ask questions: if the same things repeat again and again, the script is looping.
  • The caller gets stuck on personal questions, such as a date of birth or an inside family detail.
  • Ask for a video call; a scammer will usually disconnect immediately.
  • Prefer well-known apps such as WhatsApp or Telegram, where the caller's identity is better verified.

What can you do right now?

  • Install a caller-ID app such as Truecaller or Hiya.
  • Follow Cyber Dost, the government's cyber-awareness platform.
  • Limit how much of your voice is public, for example in long Instagram Reels, podcasts, etc.
  • Record suspicious calls (where legal).
  • Report fraud immediately: call 1930 or file a complaint on cybercrime.gov.in.
