Deep-Fake Audio Technology Is Scammers' and Fraudsters' New Way


Scammers can now imitate the voices of your loved ones to trick you into sending them cash. Deep-fake audio technology is scammers' and fraudsters' new way of operating.

Deep-fake audio technology enables criminals to replicate and recreate voices and make them say anything they want. The simple explanation is that crooks have started imitating people you know to trick you into transferring cash to them.

They do this by processing audio you have made publicly available. Text-to-speech specialist firm CereProc told The Sun (UK) the following about the new scam:

“If you or your loved ones have posted videos with sound or audio files on social media, the scammer can get hold of it that way.”

“Again, they can go ‘old school’ to get the voice samples by, for example, identifying a couple, calling one partner a few times, keeping them on the phone and recording their answers. They can say they are doing market research.”

“Then the hacker would simply clone the voice using Artificial Intelligence (AI) technology and use it against the other partner.”

“Depending on the quality of the original audio file, it might not sound exactly like your loved one, but it can still be convincing.”

Matthew Aylett, CereProc’s chief text-to-speech specialist, says: “Deep-fake audio can be weaponised by criminals and gangs to defraud victims and sabotage business activities through telemarketing.”

“It is a very real threat and there is no solution yet. By using deep-fake audio to replicate a loved one’s voice, victims could be duped into sharing their bank account details or transferring money to a third party. This is especially concerning for the elderly, who aren’t as tech-savvy as younger generations.”

Reports reveal that some celebrities have already been victims. Deep-fake audio is the audio counterpart of deep-fake videos, which have targeted celebrities such as Emilia Clarke and Natalie Dormer to create fake porn.

Since there is no solution yet, the following advice has been offered to help you stay safe:

  • When making a purchase, be suspicious of any request to pay by bank transfer or virtual currency instead of safer methods such as a credit card or a payment service like PayPal.
  • Listen to your instincts: if something feels wrong, then it is usually right to question it.
  • Don’t pay for goods or services unless you know and trust the individual or business.
  • Personal information obtained from data breaches is making it increasingly easy for fraudsters to create highly targeted phishing messages and calls, so please watch out for these.
  • You shouldn’t assume the caller is genuine just because they are able to provide some basic details about you.
  • Always be suspicious of unsolicited requests for your personal or financial information.
  • Further, since the technology struggles to converse as effectively as a human, check whether the caller can answer your questions.
  • It is also helpful to hang up, call the number back, and see who or what answers.

Mr Aylett further advises: “We must not underestimate the depths that criminals will sink to in order to get what they want, even if it means impersonating your family members to manipulate victims into handing over their bank account information or transferring their savings.”
