September 06, 2024

Imitation is the Sincerest Form of Fraudulent Activity: Artificial Intelligence in Financial Scams Against Older Adults

Dinesh Napal, LL.M
The PDF version of this article, which includes endnotes and footnotes, can be found in Bifocal, Vol. 45 Issue 6.

Introduction

There are many variations on the classic “scam call” – anything from a voice on the other end of the line pretending to be a representative from your bank, to a robotic warning that your information has been stolen, demanding payment to better protect your data. While such calls may seem obviously fraudulent to some, people continue to be victimized every year, and older adults are particularly vulnerable. In 2023, the FBI reported a 14% increase in the number of telephone scam complaints filed by adults over the age of 60, with losses rising from US$3.1 billion in 2022 to US$3.4 billion a year later.

Older adults are particularly vulnerable to scam calls – many are more trusting of people who present themselves as having authority, power, or influence. Compared to younger adults, older adults are also more likely to lack confidence in their ability to identify a scam, or may have issues relating to cognitive decline that impact their capacity to recognize malicious intent. Furthermore, older adults may experience loneliness or fears of abandonment, which may make them more likely than other age groups to continue engaging with individuals who present themselves in a way that earns their trust.

With technological advancement, especially in the artificial intelligence (AI) space, perpetrators are devising new and nefarious ways to obtain the trust, and eventually the money or resources, of people of all backgrounds. With the rise in the use and development of generative AI, scammers are profiting far more from scam calls than ever before, presenting further challenges to the financial and emotional wellbeing of older adults.

How do scammers use AI?

Generative AI is used in the production of scam calls through a phenomenon known as “voice cloning.” Generative technologies and programs are trained on a wide dataset of stored media – anything from images, text, video, audio, and more – available on the internet, publicly or privately, and create new content in response to a user's prompt. One widely known platform, ChatGPT, can be used in this way. You could write a prompt or request, like “create an image of a person standing on the Moon, watching the Earth,” and it will draw on the patterns it has learned from similar images and other content to create a new image as close to what you requested as possible.

Through this, some AI software can be used to generate new audio content based on existing recordings. Notoriously, these existing recordings are often taken from recorded calls or from audio and video content posted to social media or other online platforms. The new, cloned audio is intended to mimic an individual's voice as closely as possible, presenting the content as if it were an original statement made by that person. Audio and video versions of this phenomenon are known as “deepfakes.”

Often, celebrities are used as the source material for such deepfakes, especially given how much recorded material containing their voice and appearance is publicly available. High-profile incidents include the actor Tom Hanks having to warn fans that a circulating video of him advertising dental insurance plans was an AI deepfake, and actor Scarlett Johansson accusing the company OpenAI of using an “eerily similar,” and suspected deepfaked, version of her voice for their chatbot technology, despite her previous refusal to officially voice the program.

With traditional scam calls, the perpetrator would often have to present themselves as convincingly as possible as a representative of an official organization or government agency seeking money or information from their target. Using AI voice cloning, perpetrators can impersonate celebrities, authority figures, and sometimes even loved ones or relatives of the target, and convince them to offer financial assistance or resources. Through the cloned voice, they can present themselves more deceptively than ever – as being in danger or in need – and persuade their target to hand over whatever they are after.

How vulnerable are older adults to AI voice cloning scams?

While older adults may be especially vulnerable to deepfake audio scams, it is important to remember that the technology is so new that people of all ages and backgrounds are susceptible to being harmed by its use in financial scams. It can be difficult for anyone to decipher whether a call is genuine, especially when it sounds like someone you love who is scared, threatened, or in desperate need of your help. Notable examples of older adults being specifically harmed by deepfake audio scams include:

  • In 2023, two grandparents in Saskatchewan received a call from someone presenting themselves as their “grandson” telling them that he needed cash for bail money after landing in jail. When they attempted to withdraw thousands of dollars from their bank to rescue him, their bank manager pulled them aside to inform them that it was likely to be a scam, given that another client had received a similar call and later discovered it had been a deepfaked clone of their relative’s voice.
  • In 2023, a grandmother in Newfoundland, Canada, reported that her “grandson” had been arrested following a car accident and needed over CA$50,000 for bail. Unfortunately, she and many others were not able to evade the scam, with older adults in Newfoundland reportedly losing a combined US$200,000 to AI voice cloning scams.
  • In 2020, a well-established Philadelphia attorney with over 40 years of experience in legal practice came within steps of providing several thousand dollars in bail money to scammers using AI to impersonate three individuals – their “son,” their son's “public defender,” and an “official” of the Montgomery County Court.

These incidents and the scale of victimization raise alarming questions and considerations. Older adults may be particularly vulnerable due to the added layer of personalization perpetrators achieve through AI-generated impersonation of loved ones, which deepens the existing vulnerabilities older adults may already have to financial scams. They may be even more likely to act on the scammer's request, or at least take fewer steps to discern whether the request is genuine, because they fear harm or distress being inflicted on their loved one, or because they feel special in having been contacted by a public figure or celebrity.

Challenges presented by AI deepfakes and voice clones in financial scams have been prevalent enough to warrant a hearing before the US Senate Special Committee on Aging in November 2023. At the hearing, older victims shared their stories of financial scams involving AI voice cloning, and the Committee's chair, Senator Bob Casey (D-Pa.), urged federal guardrails to protect consumers from such scams.

What can older adults do to avoid such scams?

While addressing the spread and challenges presented by AI voice cloning may need a multi-disciplinary, cross-stakeholder effort and potentially federal action, there are small steps that individuals can take to further protect themselves if they ever receive a scam call:

  • Take a step back and think before acting – consider trying to call your relative or loved one back on their known contact number, or call another family member or close friend to let them know what has happened and ask for their support.
  • Report any scams to your financial services provider and to your Congressperson – this can help with spreading awareness of the issue, and address any scams that have particularly targeted your local community.
  • Establish a “code word” or secret phrase with your close family, friends, or community – an AI voice clone might be able to impersonate your loved one, but it might not know something specific to your life or your relationships. You could establish a secret question, phrase, or word, such as the correct answer to “what street did I grow up on?”, or a specific word you can ask them for if they are really in danger.

Further resources for promoting awareness of AI scams and financial fraud, and for protecting older adults, are also available.

The material in all ABA publications is copyrighted and may be reprinted by permission only.