What to Do If Someone Steals Your Face

Heather McKinney
9 min read · May 23, 2021

No, Tom Cruise didn’t really say and do all that stuff you saw on TikTok

Every week in my newsletter, I answer a legal question from readers.

This week’s question is from Julissa on Instagram. Julissa asks:

So I’m seeing with all the apps you can use to swap your face with a celebrity’s face there are now TikToks of people just using the app and doing funny videos while pretending to be a celebrity like @deeptomcruise for example. My question is what kind legal protections exist to protect your likeness and how it can and can’t be used? Thank you!!!

Thanks for asking, Julissa!

As a lawyer fighting scams, I think deepfakes are the next big threat to scam victims, especially seniors. One way I tell seniors to try to avoid romance scams or celebrity scams is to ask to video chat with the person reaching out to them. Now, if scammers can put a celebrity’s face onto their own and pretend to be someone they’re not, how can we keep ourselves safe?

As a person who has seen Face/Off, I am equally concerned about facial transplant surgery. But this question is about deepfakes, so let’s go with that.

Nicolas Cage being very weird in the film Face/Off
It’s Like Looking In A Mirror Only… Not.

What are deepfakes?

In case you are not familiar, the term “deepfake” refers to manipulated media (photo, video, or audio) that creates a convincing but false piece of new media. Scientists (or scammers) use a special type of computing system to analyze existing photos, videos, or audio of a person and learn how to recreate that person in new media. For instance, when creating a deepfake video, the program may learn which mouth shapes correspond to various sounds in order to mimic them.

These algorithms work best when there is tons of footage of a person — Tom Cruise, Tom Hanks, Tom Holland. Also other famous people not named Tom, like Barack Obama, George W. Bush, and Hillary Clinton.
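For the technically curious, here is a very loose sketch of the idea behind the classic face-swap setup: a shared encoder learns a generic “face code,” and a separate decoder per person learns to rebuild that code as that person’s face, so “swapping” means encoding person A’s face and decoding it with person B’s decoder. This is a simplified illustration, not the code behind @deeptomcruise or any particular app; the layer sizes, image sizes, and random “training data” below are placeholders.

```python
# Minimal sketch of the shared-encoder / two-decoder face-swap idea (PyTorch).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 64 * 64, 512),
            nn.ReLU(),
            nn.Linear(512, 128),  # compact "face code"
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(128, 512),
            nn.ReLU(),
            nn.Linear(512, 3 * 64 * 64),
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A
decoder_b = Decoder()  # trained only on faces of person B

# Stand-in batches of 64x64 RGB face crops (real training uses thousands
# of aligned video frames per person, not random noise).
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-3,
)

# Each person's decoder learns to reconstruct that person from the shared code.
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()

# The "swap": encode person A's face, then decode it as person B.
with torch.no_grad():
    fake_b = decoder_b(encoder(faces_a))
print(fake_b.shape)  # torch.Size([8, 3, 64, 64])
```

The point of the sketch is why footage matters: the more frames of a person you have, the better their decoder gets at reproducing them from the shared code, which is exactly why the Toms and ex-presidents of the world make such good targets.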

Deepfake versions of George W. Bush, Barack Obama, Hillary Clinton, Tom Hanks, Tom Holland, Robert Downey Jr. and Tom Cruise
None of these are real. NOTHING IS REAL!

It’s funny to watch a person who appears to be Tom Cruise fall down in an upscale store. Believe me, I’ve laughed at it. But it is scary how much cheaper and easier it is becoming to create more and more credible deepfakes. There are even apps now that allow you to perform a deepfake live during a video chat, making my advice to always ask for a video chat pretty useless.

Just compare the 2018 Barack Obama deepfake linked above with the 2020–2021 Tom Cruise deepfakes from TikTok. The Obama video is fun to watch, especially knowing it’s Jordan Peele behind the fake, but you can tell there’s something just beneath the surface that’s not quite real. On the other hand, the Tom Cruise videos are so spot on that the creator made a breakdown video just to show how it was done and calm people’s fears.

There is also this incredibly realistic and funny, but wildly inappropriate, video created by the South Park guys featuring our most recent former president and a handful of other celebs including Julie Andrews and Michael Caine. The voices are clearly silly exaggerations, but to my untrained eye, the videos seem flawless.

What kind of legal protections exist to protect your likeness, and how can and can’t it be used?

The good news is we don’t need to rush to make a bunch of new laws to keep up with this new technology. Good old-fashioned common law can protect you from being victimized by deepfakes in most cases. There are a couple of torts (tort = a reason for suing someone) you could use to recover after being the victim of one of these videos.

One such tort is called “false light.” False light is recognized in some states, though not Texas. According to the Restatement (Second) of Torts, when you sue someone for false light, you have to prove:

  1. The defendant/deepfaker published the information widely (i.e., not to just a single person, as in defamation);
  2. the publication identifies the plaintiff/you;
  3. it places the plaintiff/you in a “false light” that would be highly offensive to a reasonable person; and
  4. the defendant/deepfaker was at fault in publishing the information.

False light claims can be similar to defamation claims, which is actually why Texas doesn’t recognize false light as a cause of action. Texas courts have said that behaviors that other states would recognize as “false light” are covered under Texas defamation laws. Any expansion, the courts said, would have an impermissible chilling effect on free speech and run afoul of the First Amendment. So in Texas, someone may try to sue a deepfaker for defamation.

When bringing a claim for something like defamation, the law distinguishes a private figure from a public one. Private figures are ordinary, non-famous citizens. If a citizen who has no public persona sues for defamation, they would only need to claim that the bad actor was negligent regarding the truth or falsity of the defamatory statement at issue. On the other hand, public figures would have to prove that the bad actor knew the statement was false, or recklessly disregarded whether it was false.

This makes it particularly hard for politicians, who are generally considered to be public figures, to recover under defamation suits. However, if a deepfake is “fake” by its very nature, would it be so difficult to prove that the bad actor knew it was false? They created the falsity themselves.

New laws may also help politician-victims of deepfakes. In Texas, we now have our very own anti-deepfake law. In 2019, Texas became the first state to outlaw political deepfakes by statute, making it a crime to create videos “with intent to injure a candidate or influence the result of an election” that are “published and distributed within 30 days of an election.” California passed its own version of the bill in 2019 as well.

Some have warned that these laws are unconstitutional, but because they are so new, they have not yet been challenged in court.

It’s not just political videos we have to worry about. A 2018 study cited by the MIT Technology Review found that between 90% and 95% of deepfake videos are not whimsical Tom Cruise gaffes or political videos but are, instead, nonconsensual pornography. About 90% of those videos feature women, both famous and non-famous. Current revenge porn laws don’t cover deepfake pornography made without the subject’s consent.

Even so, other laws may be effective in stopping nonconsensual deepfake porn. Creators of these harmful videos could face criminal charges like harassment, cyberbullying, or even extortion for making and distributing them without the subjects’ consent.

Can celebrities sue for deepfakes?

Not easily. Jay-Z found himself the subject of vocal deepfakes. The iconic rapper has such a distinctive way of rapping/speaking that a YouTube channel called Vocal Synthesis was able to upload videos of him supposedly rapping the “To be or not to be” soliloquy from Hamlet and the lyrics to Billy Joel’s “We Didn’t Start the Fire” (shout out, Billy Joel!). Both videos were vocal deepfakes.

Hova’s legal team issued Digital Millennium Copyright Act (DMCA) takedown notices to YouTube to remove the videos for violating copyright, but their requests failed. Why? You can’t copyright someone’s manner of speaking.

Both Jay-Z’s Shakespearean monologue and the Billy Joel bit would fall under Fair Use parody anyway, as would things like the Tom Cruise deepfakes showing a madcap Cruise tripping and falling. Things change when someone tries to make money off the sound-alikes, though.

It doesn’t have to be a fake video posted online, either. In the late 1980s, McDonald’s introduced Mac Tonight, a moon-headed crooner who played piano and invited customers to enjoy late-night meals. His singing style was a little too close to that of the then-deceased singer Bobby Darin, whose estate sued McDonald’s for trademark infringement, causing McD’s to nix the commercials.

In a completely unrelated turn, Mac Tonight has since become an alt-right white supremacist meme because we apparently can’t have nice things. You’re welcome for that bizarre rabbit hole.

Along those lines, Texas and other states recognize the Right of Publicity — that is, the right of a person to make money off their name and likeness. It is actually considered a property right under statute. This law would protect someone from having a deepfake of them used for commercial purposes.

Under Texas common law, an individual could also make a similar claim for “misappropriation,” which courts have broken down into three elements:

  1. that the defendant/deepfaker appropriated the plaintiff’s name or likeness for the value associated with it, and not in an incidental manner or for a newsworthy purpose;
  2. that the plaintiff can be identified from the publication; and
  3. that there was some advantage or benefit to the defendant.

So if (1) a deepfaker appropriated your name/likeness for the value — that is, to make money, (2) the fake media is identifiably you, and (3) the deepfaker is advantaged or benefitted by the deepfake, you could possibly prevail on a claim of misappropriation.

What can we do to stop deepfakes?

To sum it up, civil causes of action like defamation, false light, right of publicity, and misappropriation should work to protect most deepfake victims from having their faces used for inappropriate or commercial purposes. However, things like Fair Use/parody and high bars of recovery for defamation of public figures may make that difficult in some cases.

Criminal laws prohibiting harassment, cyberbullying, and extortion may also be used to help victims of deepfakes. However, the most common use for deepfakes — nonconsensual porn — isn’t explicitly covered by current revenge porn laws, not yet at least. States like Texas and California have passed deepfake laws to try to prevent meddling in elections by criminalizing deepfakes, but those laws may end up on the constitutional chopping block if they are ever challenged in court.

Of course, we don’t want to criminalize parody videos or restrict speech. Watching Tom Cruise slip and fall in a fancy store is hilarious. WE MUST PROTECT IT AT ALL COSTS!

But should there be some protection for the public from these tricky videos, photos, and audio? Probably, considering a company lost $243,000 when scammers used deepfake audio to mimic a CEO’s voice and demand a hefty funds transfer. With the increasing accessibility of the technology and the unceasing motivation of scammers to steal money by any means possible, this is only the beginning.

Next time you watch a video, ask yourself, like Shakespeare would: “doth mine eyes deceive me?” Is that Shakespeare or was it the bard Shawn Carter? 🤔 Either way, we can’t trust photos/video/audio we see on the internet. We can ONLY trust what we see in person. If we see someone in person, we know it’s them and not an imposter, right? RIGHT!?!?

Nicolas Cage being weird in the film Face/Off again.
I want to take his face…off.

Oh no.

Thanks for the question, Julissa!

Got a question? Submit it here. They can be legal what-if questions, questions on current events, or questions about the legality of actions in TV shows or movies you’ve seen. I never ever want to answer your personal legal questions, so don’t send those. Love you, but I don’t do that.

Thanks for reading! If you loved it, please consider sharing it with others and subscribing. Don’t forget to listen to Sinisterhood, check out our shop, and support us on Patreon. Learn more about me here.

The opinions, language, findings, conclusions, or recommendations expressed are mine alone and do not necessarily represent the official position or policies of my employer or anyone else who may be affiliated with me. Don’t blame them. This is all on me.


Heather McKinney

writer • comedian • real life lawyer • co-host of Sinisterhood