A mother in Arizona became the victim of a horrific telephone scam that used an artificial intelligence (AI) tool to generate a cloned voice of her 15-year-old daughter.
Jennifer DeStefano received a call from an unknown number, which she decided to answer, fearing her daughter had been in an accident since she was out of town skiing.
“I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano said.
“I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying,” DeStefano recalled.
DeStefano said she heard a man’s voice say, “Put your head back, lie down.”
DeStefano’s confusion immediately turned into fear.
“This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico,'” DeStefano said. “And at that moment, I just started shaking. In the background she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling,” she said.
The man proceeded to demand $1 million from DeStefano, then lowered it to $500,000 when she said she didn’t have the funds, The Daily Mail reported.
DeStefano kept the man talking while she was at her daughter’s dance studio surrounded by other worried moms, one of whom called 911.
DeStefano was able to confirm the call was a scam within four minutes.
“She was upstairs in her room going, ‘What? What’s going on?’ Then I get angry, obviously, with these guys. This is not something you play around with,” DeStefano said, knowing her daughter was safe.
The mom admitted she “never doubted for one second” the voice was her daughter’s.
“That’s the freaky part that really got me to my core,” she said.
Subbarao Kambhampati, a computer science professor at Arizona State University who specializes in AI, said voice cloning technology has become so advanced that it needs only three seconds of a person’s voice to clone it – which is likely what happened with the voice of DeStefano’s daughter.
“And with the three seconds, it can come close to how exactly you sound,” Kambhampati said. “Most of the voice cloning actually captures the inflection as well as the emotion.”
Kambhampati warned that deep learning technology has very little oversight and is easily accessed by the public.
“It’s a new toy, and I think there could be good uses, but certainly there can be pretty worrisome uses too,” he said.
Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, said the scammers find their voice targets on social media.
“You’ve got to keep that stuff locked down. The problem is, if you have it public, you’re allowing yourself to be scammed by people like this, because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you,” Mayo said.
Mayo said that anyone who finds themselves in such a situation should ask questions only the real relative could answer.
“You start asking questions about who it is and different details of their background that are not publicly available, you’re going to find out real quick that it’s a scam artist,” he said.