AI Deepfake Romance Scam Drains Woman’s Savings and Forces Home Sale
LOS ANGELES, Calif. — In a chilling example of how artificial intelligence is reshaping the landscape of cybercrime, a Southern California woman was defrauded out of her life savings and forced to sell her home after falling victim to a sophisticated AI deepfake scam. The scam involved a convincing impersonation of Steve Burton, a longtime star of the soap opera “General Hospital,” using AI-generated video and audio to fabricate a romantic relationship.
Abigail, the victim, was initially contacted through Facebook by an individual claiming to be Burton. The messages were intimate and personal, and the voice and video appeared authentic, convincing Abigail that she was engaged in a genuine romance. According to her daughter, Vivian Ruvalcaba, the scammer quickly moved the conversation to WhatsApp, where the deception escalated.
“She was directly messaged by an individual who used deepfake technology to mimic a familiar face and voice,” Vivian explained during an interview on the “Beyond Connected” podcast hosted by cybersecurity expert Kurt Knutsson. “The messages felt real, the love felt personal, and by the time we realized what was happening, more than $81,000 was gone — along with her paid-off home.”
The scam’s impact was devastating. Abigail had planned to retire in her condominium, but the financial losses forced the family to sell the property. Vivian has since become her mother’s advocate, investigating the scam and raising awareness about the dangers of AI-driven fraud.
This incident underscores the growing threat of AI-enabled scams, which are increasingly difficult to detect. The Federal Bureau of Investigation has issued warnings about the rise of deepfake technology being used in romance scams, urging the public to exercise caution when interacting with unknown individuals online.
Experts at the Federal Trade Commission have also highlighted the surge in AI-generated scams, emphasizing the importance of verifying identities and being skeptical of unsolicited romantic advances, especially those involving requests for money.
The Department of Justice’s Computer Crime and Intellectual Property Section is actively pursuing cases involving AI fraud, working in coordination with other agencies to develop tools and strategies to combat these emerging threats.
Victims like Abigail often face not only financial ruin but also emotional trauma, as scammers exploit trust and affection to perpetrate their crimes. The case has prompted calls for increased public education on recognizing deepfake content, as well as for technological safeguards to detect and prevent AI-based impersonations.
As AI technology continues to advance, law enforcement agencies including the Cybersecurity and Infrastructure Security Agency are urging individuals to remain vigilant, verify identities through multiple channels, and report suspicious activity promptly.
Abigail’s story is a stark reminder of the real-world consequences of AI misuse and the urgent need for enhanced protections against cyber-enabled fraud.