Police said scammers were now employing AI to clone voices, posing a major potential threat to defence institutions and political leaders. The world of cyber crime is evolving rapidly, with new scams making headlines almost daily; frauds involving sharing OTPs, clicking on malicious links and downloading fake apps have become commonplace. The latest trend involves criminals leveraging artificial intelligence (AI) to defraud unsuspecting individuals through voice scams.
Fraudsters carrying out voice cloning scams record a person's voice or find an audio clip on social media or elsewhere on the Internet. They then claim to be in distressed situations, such as being involved in an accident or losing a wallet or mobile phone while travelling abroad, and seek urgent financial assistance.
In one case, a scammer posing as an acquaintance made a WhatsApp voice call that sounded remarkably like the real person. The caller then informed the victim that the victim's sister-in-law had been admitted to a hospital in Nawanshahr and urgently needed financial aid for treatment.
Technology experts view this trend as alarming, with the potential to escalate into even more serious crimes in the future. Fraudsters could exploit AI to impersonate celebrities or political leaders, making it challenging for people to differentiate between real and fake calls.
Such misuse could also be directed against scientists and defence officers, creating a serious global problem.
Unlike traditional mimicry, in which the voice may not be 100% accurate, AI can produce voices that are indistinguishable from the original. By inputting a victim's voice sample into an AI system, scammers can generate deceptive voice calls with uncanny precision.
With the rise of "deepfakes" created using AI, in which the voices and videos of close or known individuals are exploited, such crimes may escalate, causing serious harm to society and public order.
Author: Sania,
Student, Computer Science (BCA, pursuing)