This article appears, with endnotes and footnotes, in the PDF of Bifocal, Vol. 45, Issue 6.
Introduction
There are many variations on the classic “scam call” – anything from a voice on the other end of the line pretending to be a representative from your bank, to a robotic warning that your information has been stolen and a demand for payment to better protect your data. While such calls may seem obviously fraudulent to some, people continue to be victimized every year, and older adults are particularly vulnerable. In 2023, the FBI reported a 14% increase in the number of telephone scam complaints filed by adults over the age of 60, with losses rising from US$3.1 billion in 2022 to US$3.4 billion a year later.
Older adults are particularly vulnerable to scam calls – many are more trusting of people who present themselves as having authority, power, or influence. Compared to younger adults, older adults are also more likely to lack confidence in their ability to identify a scam, or may experience cognitive decline that impairs their capacity to recognize malicious intent. Furthermore, older adults may experience loneliness or fears of abandonment, which may make them more likely than other age groups to continue engaging with individuals who present themselves in a way that earns their trust.
With technological advancement, especially in the artificial intelligence (AI) space, perpetrators are devising new and nefarious ways to obtain the trust, and eventually the money or resources, of people of all backgrounds. With the rise in the use and development of generative AI, scammers are profiting far more from scam calls than ever before, presenting further challenges to the financial and emotional wellbeing of older adults.
How do scammers use AI?
Generative AI is used in the production of scam calls through a phenomenon known as “voice cloning.” Generative technologies and programs draw on vast datasets of stored media – anything from images, text, video, audio, and more – gathered from the internet, whether posted publicly or privately, to create new content in response to a user’s prompt. One widely known platform, ChatGPT, can be used in this way. You could type a prompt or request into ChatGPT, like “create an image of a person standing on the Moon, watching the Earth,” and it will draw on the patterns it has learned from similar images and other content to create a new image as close to what you requested as possible.
Through this same process, some AI software can be used to generate new audio content based on existing recordings. Notoriously, these existing recordings are often taken from recorded calls, or from audio and video content posted to social media or other online platforms. The new, cloned audio is intended to mimic an individual’s voice as closely as possible, presenting the content as if it were an original statement made by that person. Audio and video versions of this phenomenon are known as “deepfakes.”
Often, celebrities are the source material for such deepfakes, especially given how much recorded material containing their voice and appearance is publicly available. High-profile incidents include the actor Tom Hanks having to warn fans that a circulating video of him advertising dental insurance plans was an AI deepfake, and actor Scarlett Johansson accusing the company OpenAI of using an “eerily similar,” and suspected deepfaked, version of her voice as the voice of its chatbot technology, despite her previous refusal to officially voice the program.
With traditional scam calls, the perpetrator often had to present themselves as convincingly as possible as a representative of an official organization or government agency seeking money or information from their target. Using AI voice cloning, perpetrators can impersonate celebrities, authority figures, and sometimes even loved ones or relatives of the target, and convince them to offer financial assistance or resources. Through the cloned voice, they can present themselves more deceptively than ever, appearing to be in danger or in need, or simply sounding exceptionally persuasive, in order to obtain whatever they want from their target.