Artificial intelligence is playing a significant role in advancing technology, notably in making Deepfake audio and video increasingly realistic. Ideally, this technology would be used for good, such as strengthening the country’s digital infrastructure. The current reality, however, is the opposite: as Deepfakes grow more sophisticated, it becomes easy for scammers to clone another person’s voice and use it to harass victims or trap them in cyber fraud.


Kerala’s recent Deepfake fraud incident

In a recent case of Deepfake fraud, a scammer video-called a man named Radhakrishna on WhatsApp. The scammer posed as a former colleague of the victim from Andhra Pradesh, and the caller on the video looked like that colleague. He asked for 40,000 to help a relative in the hospital. After receiving the 40,000, the scammer asked for 30,000 more, at which point Radhakrishna grew suspicious and decided to report the incident to the Cyber police.

AI is getting more powerful every day, and with that it is also getting scarier. Scammers are using AI to fulfil their malicious intentions: they use AI-powered tools to create voice Deepfakes, and they take photographs, turn them into Deepfakes, and post the results on pornography websites.

According to a McAfee report, 69% of people are unable to tell the difference between a real voice and a Deepfake voice.

Scammers also call people posing as their boss, as in a recent voice-Deepfake case in which the scammer called an employee pretending to be that person’s boss and asked for money to be transferred to a supplier. These kinds of office scams are on the rise.


Deepfake clothes-removing applications

News recently emerged that scammers are using clothes-removing Deepfake applications to harass people. AI Deepfakes are getting scarier as scammers turn them against society, and this news has been going viral on YouTube.

With the rise in the popularity of AI and AI-related tools, people are rapidly adopting artificial intelligence, and it has become very easy to manipulate any image into anything. Scammers are misusing these tools: it was reported that 85% of cybercrime victims lose their money, and a human-perception study found that people can detect a Deepfake voice only 73% of the time.



The number of scams has increased with the use of artificial intelligence. AI is helping technology advance, but bad actors use it to scam people. In the cases mentioned above, scammers target people by using Deepfakes to clone a voice and then impersonate someone the victim knows, which makes the fraud very hard to detect. A Deepfake uses AI-generated synthetic media to mimic a real person’s voice or likeness.




Author: Himanshi Singh, Associate, Policy & Advocacy Team CyberPeace

