
Fla. pol targeted in elaborate AI ‘car crash’ scam – which nearly tricked his father into forking over $35K

A lawyer has issued a warning about an elaborate AI voice cloning scam that nearly conned his father out of $35,000.

Scammers impersonated Jay Shooster, 34, and called his father Frank, 70, convincing him that his son had been in a serious car accident, had been arrested and needed bail money.

Horrified Frank, a retired attorney, said he was convinced the “hysterical” caller was his son, and was left deeply traumatized by the hoax.

Jay is running for District 91 in the Florida House of Representatives, and scammers were able to create a fake version of his voice from his 15-second TV campaign ad.

Jay Shooster, 34, has issued a warning about an elaborate AI voice cloning scam that nearly conned his father out of $35,000.
Courtesy of Jay Shooster / SWNS

Frank, also from Boca Raton, Florida, who was visiting his daughter in New York at the time, said: “Just as the Uber car arrived to take me to New York City, I got a call.

“It was my son, Jay. He was hysterical, but I recognized his voice immediately.

“He said he was in an accident, broke his nose, had 16 stitches and was in police custody because he tested positive for alcohol after a breath test.

“He blamed it on the cough syrup he had taken earlier.”

The impersonator, posing as Jay, begged Frank not to tell anyone about the September 28 incident.

Moments later, a man identifying himself as ‘Mike Rivers,’ an alleged attorney, called and said Jay needed a $35,000 bond to avoid being held in jail for a few days.

The scam escalated when ‘Rivers’ instructed Frank to pay the bond via a cryptocurrency machine – an unconventional request that raised Frank’s suspicions.

“I became suspicious when he told me to go to a Coinbase machine at Winn-Dixie,” Frank says. “I didn’t understand how this was part of the legal process.”

Frank eventually realized something was wrong after his daughter, Jay’s twin sister Lauren, and her friend discovered that AI voice cloning scams were on the rise.


Frank finally hung up.

“It’s devastating to get a call like that,” Frank said.

“My son has worked so hard, and I was beside myself, thinking that his career and his campaign could be in ruins.”

Jay, who has dealt with such scams in his work as a lawyer, was shocked to find himself a target.

Scammers impersonated Jay Shooster, 34, and called his father Frank, 70, convincing him his son had been in a serious car accident, was arrested and needed bail money, according to reports. Courtesy of Jay Shooster / SWNS

He speculated that the hoaxers may have cloned his voice from his latest campaign ad, which had aired on television just days before the incident.


“I’ve been paying attention to AI and its effects on consumers, but nothing prepares you for when it happens to you,” says Jay.

“They did their research. They didn’t use my phone number, which fit the story that I was in jail without access to my phone.”

The sophistication of the trick left Jay stunned.

“All it takes is a few seconds of someone’s voice,” he said.

“Technology is so advanced that they could have easily taken my voice out of my 15-second campaign ad.

“There are other videos of me online, so they could have used any of them to clone my voice.”

Jay is advocating changes to AI regulation to prevent such fraud from harming others.

“There are three key policy solutions we need,” he says. “First, AI companies need to be held accountable if their products are misused.

“Second, companies should obtain verification before cloning someone’s voice. And third, AI-generated content should be watermarked so that it can be easily detected, whether it’s a cloned voice or a fake video.”

If elected to the Florida House of Representatives, Jay plans to take action against the growing misuse of AI technology, including voice cloning scams.

He aims to introduce legislation that would hold AI companies accountable for misuse, ensuring they implement safeguards such as voice authentication and watermarking.

Jay is advocating changes to AI regulation to prevent such fraud from harming others. Courtesy of Jay Shooster / SWNS

“We need to create clear regulations to stop these types of crimes from happening,” says Jay. “It’s not just about the technology – it’s about protecting people from the trauma and financial damage that can result from these scams.

“I want to demand stricter requirements for AI developers to ensure their tools are not misused.”

As AI technology rapidly evolves, Jay and Frank hope their story serves as a warning to others to stay vigilant.

“It shows how important it is to stay calm and think things through,” notes Frank. “You have to listen and ask questions if something doesn’t add up. Scams like this are becoming more sophisticated, but we can’t let our guard down.”

