The PDF in which this article appears can be found in Bifocal Vol. 46 Issue 5.
June 24, 2025
Behind the Screen: Elder Financial and Technology Abuse in the Age of AI
By Steven Bradley, Safety and Technology Expert, OurFamilyWizard
The last decade has seen an unprecedented rise in financial and technology-related crimes against the elderly. As a former law enforcement investigator who spent years unraveling fraud cases, I can attest that these crimes have become not only more frequent but markedly more sophisticated. Today, the integration of artificial intelligence (AI) into everyday technologies has created both opportunities for safety and new frontiers for abuse. The elderly, often less familiar with emerging tech, are tragically among the most vulnerable.
The Scope of the Problem
Elder financial abuse is not a new phenomenon. However, its migration into the digital world has made it harder to detect and easier to execute. According to the FBI's 2023 Elder Fraud Report, Americans over 60 lost more than $3.4 billion to scams in that year alone—a 14% increase from 2022. The true numbers are likely higher, as many victims never report the crime out of embarrassment or fear.
Scammers today don’t just rely on sweet-talking or intimidating victims over the phone. They use spoofed caller IDs, phishing emails, fake websites, and even deepfake videos to impersonate family members, government officials, and tech support agents. AI has now entered the scene, amplifying the scope and believability of these attacks.
The New Face of Scams: Artificial Intelligence
AI tools can be used for good—speech-to-text services, fall detection, health monitoring—but in the wrong hands, AI becomes a powerful weapon. Here are some ways it is already being used to exploit the elderly:
- Voice Cloning: Scammers now use AI to clone the voices of victims' family members. In one recent case, a grandmother received a panicked call from someone who sounded exactly like her grandson, claiming he was in jail and needed bail money. She wired $5,000 before confirming his whereabouts.
- Deepfake Videos: AI-generated video content can convincingly mimic a person's face and speech patterns. A deepfake of a known public figure, like a Social Security official or Medicare representative, may be used to gain trust and encourage victims to divulge personal information.
- AI Chatbots and Phishing: AI-generated emails and messages are more sophisticated, personalized, and grammatically correct, making phishing attempts more convincing. Chatbots can now respond in real time, mimicking human support agents and luring victims into scams.
- Predictive Targeting: Machine learning algorithms sift through leaked or sold data to identify likely victims based on age, location, and online behavior. This profiling allows scammers to target the most susceptible individuals with surgical precision.
Case Studies from the Field
During my tenure, I encountered numerous heartbreaking stories. One case involved an 82-year-old veteran who lost $97,000 in a cryptocurrency scam. He believed he was investing in a government-backed digital currency. The website looked official. The emails were signed by "SEC agents." When I investigated, I found that the entire operation was run by an offshore ring using AI to auto-generate emails, fake compliance documents, and even simulate online account growth.
In another case, a retired schoolteacher began receiving "tech support" calls warning of malware on her computer. The voice on the other end was calm, professional, and sounded reassuringly American. It was actually a synthesized AI voice operated by fraudsters overseas. They convinced her to install remote access software, giving them full control of her computer and financial accounts.
Why Elders Are Prime Targets
Several factors contribute to the high susceptibility of older adults:
- Trusting Nature: Raised in an era of face-to-face interactions, many older adults are more trusting and polite—traits scammers exploit.
- Lack of Digital Literacy: Navigating digital landscapes is challenging, especially with evolving threats like fake browser pop-ups and realistic phishing pages.
- Isolation: Many elders are lonely and may respond to unsolicited communication simply for human interaction.
- Cognitive Decline: Even mild cognitive impairment can make it difficult to assess the credibility of requests or to remember warnings.
The Role of Caregivers and Institutions
Protecting older adults is a community responsibility. Family members and caregivers must stay alert to changes in behavior or financial patterns. Banks and credit unions should strengthen their fraud detection systems so that unusual transactions are flagged. Tech companies, especially those developing AI tools, must implement guardrails to prevent their misuse.
What Can Be Done Now
- Education: Awareness campaigns tailored for seniors, caregivers, and professionals should be a top priority. Programs must be delivered through senior centers, churches, libraries, and community events.
- Technology with Ethics: Developers must integrate ethical guidelines and protective features into AI systems. For example, AI voice cloning software should watermark or flag synthesized content.
- Legislation and Reporting: Stronger penalties for elder financial abuse are needed, along with mandatory requirements that banks, tech companies, and health providers report suspected abuse.
- AI for Good: Just as AI can be used to harm, it can also be used to help. Predictive models could be trained to detect scam patterns and alert institutions. Virtual assistants can be programmed to warn users about suspicious emails or calls. Face-recognition doorbells can prevent fake utility workers from gaining access to homes.
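To make the fraud-flagging idea above concrete for non-technical readers, here is a minimal sketch of the logic such a system might apply to an account's transaction history. The thresholds, function names, and rules are illustrative assumptions for this article, not any bank's actual system:

```python
from statistics import mean, stdev

def flag_unusual(history, amount, payee, known_payees, z_threshold=3.0):
    """Illustrative sketch: flag a transaction as unusual if it is a
    statistical outlier relative to the account's past amounts, or a
    large payment to a payee the account has never used before.
    All thresholds here are hypothetical."""
    if len(history) >= 2:
        mu, sigma = mean(history), stdev(history)
        # Outlier test: how many standard deviations above typical spending?
        if sigma > 0 and (amount - mu) / sigma > z_threshold:
            return True
    # Large first-time payee: a common pattern in wire-fraud cases
    if payee not in known_payees and amount > 1000:
        return True
    return False

# A $5,000 wire to an unfamiliar recipient, against a history of
# small routine purchases, is flagged; a normal purchase is not.
history = [40, 55, 60, 45, 50]
print(flag_unusual(history, 5000, "CryptoExchangeX", {"Grocery", "Utility"}))
print(flag_unusual(history, 52, "Grocery", {"Grocery", "Utility"}))
```

Real systems combine many more signals (timing, location, device), but even this simple rule illustrates why a sudden large transfer to a new payee deserves a phone call before the money moves.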
The Future: A Double-Edged Sword
As artificial intelligence continues to evolve, so too will the nature of elder exploitation. We may soon see AI-driven romance scams that adapt emotionally to the victim's responses, or holographic AI impersonators capable of appearing in video calls. But we may also see intelligent monitoring systems that detect abnormal financial activity or cognitive decline in real time.
What remains certain is this: we must treat elder financial and technological abuse not only as a personal tragedy but as a public crisis. It is a matter of dignity, justice, and the societal obligation we hold to those who raised us. Technology should be a bridge, not a barrier. AI should be a shield, not a sword.
Let us ensure that the tools of the future are shaped by values from the past: respect, compassion, and the unyielding commitment to protect the vulnerable.
Resources for Families and Professionals
- National Elder Fraud Hotline: 1-833-FRAUD-11
- AARP Fraud Watch Network: www.aarp.org/fraudwatchnetwork
- FBI Internet Crime Complaint Center (IC3): www.ic3.gov
- Cybercrime Support Network: www.fraudsupport.org