We all think we won’t fall for a scam. But when scammers start using artificial intelligence, fake photos, and voice clones to impersonate Hollywood royalty? Suddenly, the game changes.
That’s precisely what happened in a real case involving a French woman and a scammer posing as Brad Pitt. Yes, that Brad Pitt. And no, this wasn’t just a catfishing attempt gone wrong; it was a full-blown, AI-powered, deepfake-fueled fraud that cost the victim nearly a million dollars. In the aftermath, we have to ask what legal tools we have to fight this new breed of deception.
How the Scam Unfolded
In 2023, Anne, a recently divorced interior designer living in France, was contacted on Instagram by someone claiming to be Brad Pitt’s mother, who told her that Brad “needed a woman just like her.” That strange introduction quickly turned into direct messages from “Brad” himself, and shortly after, Anne began an online relationship with the impersonator. She wasn’t casually browsing dating sites or looking for trouble; despite initial doubts, she later explained that her lack of familiarity with social media left her vulnerable to manipulation.
Over the next 18 months, the scammer drew Anne into what she believed was a private, emotionally supportive friendship with the actor. Detailed, personal messages helped sell the scam, and the images were convincing: some showed “Brad” holding up signs with her name, others showed him in casual settings, just chatting about life. Ultimately, the messages turned into requests.
A film project, a medical crisis (kidney cancer), and an asset freeze tied to his divorce from Angelina Jolie set the stage. Each crisis came with just enough documentation to feel real: fake contracts, fake invoices, even screenshots of bank accounts. The scammers used AI-generated images and videos to complete the illusion. And so, little by little, Anne sent money, transferring funds to cover customs fees on supposed gifts and even paying for fake medical treatments. By the time it ended, she had sent about €830,000 (roughly $900,000 USD).
When images of the real Brad Pitt with his girlfriend, Ines de Ramon, surfaced in June 2024, Anne began to question the relationship. Eventually, the scammers’ demands for more money under the guise of “Special FBI Agent John Smith” prompted her to contact the police.
It Sounds Wild—But It’s the New Reality
It’s easy to read this story and think, “That could never happen to me.” But this wasn’t some fly-by-night phishing email with broken English. The scammers used cutting-edge AI tools to generate images, videos, and even voice messages that looked and sounded like Brad Pitt. They played the long game, built trust, and targeted her at a vulnerable moment.
And here’s the scary part: if it can happen to someone like Anne, it can happen to anyone. That includes your clients. It could even include you.
What Makes This a Legal Issue? Plenty.
While the headlines focused on the celebrity angle, this scam raises big legal questions, many of which don’t yet have clear answers.
1. Deepfake Impersonation Isn’t Just Creepy—It’s Criminal
At its core, the scam is identity theft. However, most existing laws were written long before AI could convincingly replicate someone’s face and voice. The law will need to catch up quickly to address synthetic media used to commit fraud or impersonate real people.
2. Fake Legal Documents Are Getting More Convincing
The scammers didn’t stop at fake photos. They also fabricated legal agreements, medical bills, and even production contracts. These weren’t crude forgeries; they were polished, formatted, and plausible enough to fool someone with a good head on their shoulders. That raises questions about document authentication, digital signatures, and due diligence in a world where seeing is no longer believing.
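For readers who want a concrete sense of what “document authentication” can mean in practice, here is a minimal sketch of cryptographic signature verification using Python’s widely used (third-party) cryptography package. This is a general illustration, not something drawn from the case itself, and the contract text and key handling are hypothetical placeholders; real-world workflows typically layer this same idea into PDF signing tools and public key infrastructure.

```python
# A minimal sketch of verifying a digitally signed document.
# Assumes the third-party "cryptography" package is installed
# (pip install cryptography). All names and content are illustrative.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The signer signs the exact bytes of the document with a private key
# that only they control.
contract = b"Production agreement: Party A agrees to ..."
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(contract)

# The recipient verifies the signature against the signer's public key.
# verify() raises InvalidSignature if the document or signature
# has been altered in any way.
public_key = private_key.public_key()

try:
    public_key.verify(signature, contract)
    print("Signature valid: document is unchanged and from the key holder.")
except InvalidSignature:
    print("Signature check failed: do not trust this document.")

# A forged or edited copy fails verification, no matter how polished it looks.
tampered = contract + b" (with an extra clause the signer never approved)"
try:
    public_key.verify(signature, tampered)
except InvalidSignature:
    print("Tampered copy rejected.")
```

The point of the sketch is simple: a convincing-looking contract proves nothing on its own, but a valid cryptographic signature cannot be produced without the signer’s private key, which is exactly the kind of check a fake “Brad Pitt contract” would never survive.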