
In recent months, the use of artificial intelligence (AI) has become increasingly controversial, with voice deepfakes creeping into the music industry and sparking debate around the world.

AI-powered voice filters have been used to mimic artists such as Drake, The Weeknd and Jay-Z. Even more alarmingly, some scammers are believed to have used voice cloning technology to stage more realistic kidnapping scams.

Jennifer DeStefano told American media that she nearly fell victim to a virtual kidnapping scam, even though she had always believed that “mothers know their children.”

On January 20th, an Arizona resident received a call from an unknown number while trying to pick up her daughter Aubrey from the dance studio.

She considered declining the call, but answered in case it was a medical emergency, as her eldest daughter, Brianna, was training for ski races with her father at a resort in northern Arizona more than 100 miles away.

DeStefano told CNN that when she answered the phone, she heard screaming and sobbing, and “the voice sounded like Brie [Brianna]’s voice and all the inflections.”

“Suddenly, I heard a man say, ‘Lie down and put your head back.’ I panicked.”

Then she heard a man tell her that he had her daughter. He warned that he would take the girl across the border and drop her off in Mexico, and that DeStefano would never see her again.

The man also demanded a ransom of US$1 million.

DeStefano ran into the dance studio and screamed for help. A woman helped her call the authorities.

Thankfully, Ms DeStefano managed to get in touch with her husband, who confirmed that Brianna was with him and unharmed.

She also spoke to Brianna herself, who said she was in bed and had no idea what was going on.

Source: ‘Everything was so real’: Virtual kidnapping scams made more realistic with AI – https://www.straitstimes.com/world/everything-was-so-real-virtual-kidnapping-scams-made-more-realistic-with-ai
