It’s a short video. Keegan-Michael Key is at the AFI Awards, addressing an interviewer, when Tom Cruise runs up behind him, leaps over his head in a single bound and stunts for the camera.
“Just want to jump in and wish you luck,” Cruise says. “Congrats on the AFI awards. Congrats on life. Congrats on the look. Work on the humor a little bit.” Then he offers a certified Cruise laugh and exits.
It’d be a pretty unhinged moment for anyone not named Tom Cruise but, wait … is it Tom Cruise?
When jokes fly over your head 😂✌️@Keegan-Michael Key
It’s close, but as hundreds of thousands of viewers across TikTok and other social media channels noted, it’s not quite there. Maybe it’s the voice, which is just a hair too high. Maybe it’s the body, which moves with a little less precision than we’re used to from Cruise’s nuclear intensity. It’s not the face, which looks pretty much on the money, but the video can’t quite escape the uncanny valley, even though it gets closer than many similar deepfakes.
It’s the creation of Miles Fisher, who has been honing his Tom Cruise impersonation for a while, with a little help from Belgian visual effects specialist Chris Umé. The pair use deepfake technology: a machine-learning model is trained on thousands of images of the person being impersonated, then used to map that person’s likeness over the face of whoever’s doing the impersonation, frame by frame.
The quality of the result depends on a few factors, like how committed the impersonator is and how many photos you have to work with. But if you’re someone like Fisher, who has a pretty decent Tom Cruise voice, and you’re working with as many extant photos and videos as Cruise has, the results can be awfully convincing.
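The standard deepfake setup behind results like this is a shared encoder paired with one decoder per identity: both people’s faces are compressed into the same latent space, and the swap happens by decoding one person’s frame with the other person’s decoder. The toy sketch below illustrates only that wiring — the data, the "encoder," and the "decoders" are stand-ins for what would really be deep neural networks trained on thousands of aligned face images; none of it reflects Fisher and Umé’s actual pipeline.

```python
# Toy sketch of the "shared encoder, per-identity decoder" deepfake
# architecture. Faces are stand-in lists of pixel values; the encoder
# and decoders are placeholders for trained neural networks.

def encode(face):
    # Shared encoder: compress a "face" into a small latent vector
    # that captures pose/expression but not identity.
    mid = len(face) // 2
    return [sum(face[:mid]) / mid, sum(face[mid:]) / (len(face) - mid)]

def make_decoder(style_offset):
    # Each identity gets its own decoder, which reconstructs a face in
    # that person's "style" (a stand-in for learned per-identity weights).
    def decode(latent):
        return [v + style_offset for v in latent for _ in range(2)]
    return decode

decoder_cruise = make_decoder(style_offset=0.5)   # "trained" on Cruise images
decoder_fisher = make_decoder(style_offset=-0.5)  # "trained" on the impersonator

# The swap: encode a frame of the impersonator, then decode it with the
# Cruise decoder, so the output keeps Fisher's pose but Cruise's look.
fisher_frame = [0.2, 0.4, 0.6, 0.8]
latent = encode(fisher_frame)
swapped = decoder_cruise(latent)
print(swapped)
```

The key point the sketch captures is why so much source footage matters: each decoder only renders one identity well, and it takes a large, varied photo set to train it across every pose, expression and lighting condition the impersonator might throw at it.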
It’s all pretty fun at this scale, and Fisher has built a big following on TikTok with it. But experts are concerned about the technology’s possible dark side. Websites like Reddit have had to crack down on deepfake pornography, which can map anyone’s face onto sexually explicit videos. Celebrity women are an easy target for deepfake porn, and a few have spoken out about their horror and disgust at seeing their likeness hijacked for such clips. “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Scarlett Johansson said in 2018. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause … The Internet is a vast wormhole of darkness that eats itself.”
But even non-famous women have found themselves targeted by deepfakes. Customers can pay deepfake creators to make realistic-looking pornographic videos of neighbors, co-workers, exes and friends. All they need is time and a decent-sized catalog of pictures — like, say, an Instagram or Facebook account. These videos might become part of someone’s private porn stash, or they could be used as blackmail. The law is still years away from sorting out the legal ramifications of deepfakes, but for the time being, the effect on some women’s lives has already been very real.
And even that’s just the tip of the iceberg, as deepfake technology can be used for purposes far beyond targeting private individuals. Experts warn of a future in which deepfakes could be used during election cycles to make it look like a candidate said something they didn’t, or in war to make it look like an opposing political leader did something they hadn’t.
“For folks who don’t have a high profile, or don’t have any profile at all, this can hurt your job prospects, your interpersonal relationships, your reputation, your mental health,” feminist media critic Anita Sarkeesian, who has herself been the target of a deepfake, told The Washington Post. “It’s used as a weapon to silence women, degrade women, show power over women, reducing us to sex objects. This isn’t just a fun-and-games thing. This can destroy lives.”