Scammers use deepfake voice recordings and videos to trick victims’ family, friends

Minister of State for Home Affairs Sun Xueling said scammers can also use deepfake technology to clone authority figures. PHOTO: LIANHE ZAOBAO

SINGAPORE - Scammers are tapping sophisticated artificial intelligence (AI) tools to create deepfake voice recordings and videos of people to fool their relatives and friends into transferring money.

Speaking at the Regional Anti-Scam Conference 2023 at the Police Cantonment Complex on Tuesday, Minister of State for Home Affairs Sun Xueling said scammers can also use deepfake technology to clone authority figures.

“We have already seen overseas examples of bad actors making use of deepfake technology to create convincing clones – whether voice recordings or videos of public figures – to spread disinformation,” she said.

“As such, we need to constantly monitor this threat and work with research institutes, relevant government agencies and market players who themselves are at the forefront of these technologies to study ways to counter them.”

Her comments come in the wake of a rise in AI-driven fraud, and amid reports of countries like China rolling out new rules to curb the use of generative AI to alter online content.

In May, police in a region of Inner Mongolia were alerted to a case where a scammer used face-swopping technology to impersonate a victim’s friend during a video call.

Believing that his friend needed to pay a deposit to complete a bidding process, the victim transferred 4.3 million yuan (S$805,000) to the scammer.

He realised he had been duped only after the friend said he knew nothing of the situation.

Meanwhile, there is growing concern in Europe about scammers using AI to recreate the sound of family members’ voices – known as audio deepfakes – to dupe people into transferring money.

Associate Professor Terence Sim, who researches deepfakes and other kinds of digitally altered images at the National University of Singapore’s Centre for Trusted Internet and Community, said: “When a victim sees a video of a friend or a loved one, he tends to believe that it is real and that they are in need of help. There is an increased level of believability and deception behind such scams.”

Such deepfake technology has become easier to use over the years, which makes it all the more concerning, he added.

“All a scammer needs is a few photos of the target’s face, which can be taken from social media, to create a deepfake. That is scary,” he said, adding that deepfakes using audio alone may also be used to trick victims.

“A video clip of the target’s voice, which can be as short as 10 to 15 seconds, can be used to create an audio deepfake.

“Through machine learning, the technology obtains a sample of the target’s voice and extracts its characteristics to create what appears to be a phone call of distress. That is literally putting words into someone’s mouth,” he said.

Mr Adrian Hia, cyber-security firm Kaspersky’s managing director for the Asia-Pacific, said deepfake technology can be used to scam people everywhere.

“Deepfake campaigns, when done well, can be highly convincing as they play into human emotions,” he said.

“In most attacks, victims will find themselves being misled into behaviours centred around fear, excitement, curiosity, guilt and sadness. With emotions heightened, the intention is to make the victim act with urgency, forgoing rational thought,” said Mr Hia.

“Once a vulnerability is exposed, the attacker will exploit this weakness to reach their end goal of either financial pay-offs or reputational damage,” he added.

There are ways to spot a deepfake. The lighting in a deepfake video may change from one frame to the next, and the person in it may exhibit jerky movements and blink in a strange manner or not blink at all, said Mr Hia.

Another telltale sign is when the speaker’s lips are not synchronised with his speech.

For audio deepfakes, Prof Sim said the public should pay attention to the words used.

“The scammer would not know the intimate details of your loved ones. If you know that is not a phrase they would use or how they would speak, that is a sign.

“At the end of the day, scammers do this to create a sense of urgency in the victim.

“It is always important to go through another channel and check with your loved ones before doing anything, as we are in an era where seeing should not necessarily mean believing,” said Prof Sim.
