
The Invisible Danger: Exploring the World of AI Virtual Kidnapping

In recent years, the proliferation of Artificial Intelligence (AI) technology has brought forth a new wave of criminal activity known as AI virtual kidnapping. This form of extortion relies on sophisticated algorithms to impersonate kidnappers and exploit the fears of victims and their loved ones. As this nefarious practice evolves, it presents complex challenges that demand attention and action.

What is Virtual Kidnapping?

Virtual kidnapping can be described as a type of extortion scam in which a malicious actor contacts a person, claiming to have kidnapped a loved one. Previously, scammers would use names and personal information about the loved one found on social media or public websites to make the kidnapping seem legitimate.

However, with the rapid advancement of artificial intelligence (AI) technology, crimes such as virtual kidnapping have taken on a new form. There have been instances where a deepfake voice of the victim’s loved one was used as proof that the scammer had the loved one in their possession. A deepfake typically transforms existing source content by swapping one person for another; however, the technology can also create original content in which someone appears to do or say something they never did or said.

The scammer, maintaining the pretense of having kidnapped the victim’s loved one, will then usually demand a ransom payment.

In January 2023, Jennifer DeStefano, a resident of Arizona in the USA, received a call from an unidentified number. Upon answering the phone, she heard what appeared to be her 15-year-old daughter saying, “Mom, these bad men have me. Help me. Help me. Help me.” A man then took the phone and told her that they would drug her daughter, harm her, and transport her to Mexico if she spoke to anyone. The scammer demanded a ransom of $1 million, which was reduced to $50,000 after a brief negotiation. Mrs. DeStefano was able to verify that her daughter was on a skiing trip with her father, unharmed and not kidnapped, before any ransom was paid. It was later revealed that the scammers had used artificial intelligence to mimic her daughter’s voice in an attempt to extort money.

Even though virtual kidnappers may not have any real access to the victim or their loved one, virtual kidnapping can inflict psychological trauma on victims through the belief that a loved one is in genuine danger.

What can you do if you receive a distressing call from a loved one?

There is software that can be used to identify deepfakes. To distinguish between real and fake audio, a detector uses visual representations of audio clips called spectrograms, the same representations used to train speech synthesis models.

When you are listening to the call, it might not seem possible to tell the voice apart from the real person; however, voices can often be distinguished when their spectrograms are analyzed side by side.
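For readers who are comfortable with a bit of Python, the sketch below illustrates the general idea: it plots the spectrogram of a known recording of your loved one next to the spectrogram of the suspicious audio so the two can be compared visually. This is only an illustration, not a deepfake detector; the file names are placeholders, and dedicated detection software relies on trained models rather than visual inspection alone.

```python
# A minimal sketch, assuming Python with NumPy, SciPy and Matplotlib installed.
# "reference_voice.wav" and "suspect_call.wav" are placeholder file names:
# a known recording of the loved one and audio saved from the suspicious call.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

def plot_spectrogram(ax, path, title):
    rate, samples = wavfile.read(path)        # sample rate (Hz) and raw audio
    if samples.ndim > 1:                      # mix stereo down to mono
        samples = samples.mean(axis=1)
    freqs, times, power = spectrogram(samples.astype(float), fs=rate)
    # Plot power on a decibel scale so quiet detail remains visible.
    ax.pcolormesh(times, freqs, 10 * np.log10(power + 1e-10), shading="gouraud")
    ax.set_title(title)
    ax.set_xlabel("Time (s)")
    ax.set_ylabel("Frequency (Hz)")

fig, (ax_left, ax_right) = plt.subplots(1, 2, figsize=(12, 4))
plot_spectrogram(ax_left, "reference_voice.wav", "Known recording")
plot_spectrogram(ax_right, "suspect_call.wav", "Suspicious call")
plt.tight_layout()
plt.show()
```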

Of course, most people will not be in a position to generate and analyze spectrograms during a distressing call, so what can you do when you are not certain that what you are hearing is real?

  • If you receive an out-of-the-blue call from a loved one making a request that seems out of character, hang up and call them back or send them a text to confirm that it really is them you are talking to. If they do not respond, contact the police or other relevant authorities as soon as possible.
  • If the scammer has requested a ransom, do not make any agreements or send any money. Contact the police instead.
  • Gather as much information as possible from the scammer to help the police identify them during their investigation. Save any evidence, such as screenshots of communication with the scammer.
  • Additionally, you should keep a record of any contact you have with the police.

It is important to mention that the Cybercrimes Act 19 of 2020 regulates the powers of the South African Police Service to investigate cybercrimes. The Act also provides for the establishment of a designated Point of Contact (a cybercrimes unit) to enforce the law by providing assistance in proceedings or investigations into the commission, or intended commission, of a cybercrime. However, the Standard Operating Procedures (SOPs) and designated Points of Contact (POCs) intended to facilitate such enforcement have not yet been established.

There are, however, portals such as Cybercrime.org.za that provide resources, information, and reporting mechanisms for cybercrime.

Conclusion

As the capabilities of AI expand, the line between reality and fiction will increasingly blur, and it is unlikely that we will be able to put the technology back in the box. This means that people will need to become more cautious.

We all want our children to grow up using the internet and social media safely and responsibly; it is therefore crucial to understand the potential dangers while still enjoying the fun side of it.

Please note that the information provided in this blog post is general in nature and should not be construed as legal advice. For specific legal guidance, we encourage you to reach out to our team of experienced attorneys.
