The Washington Post has published a deep dive into the sudden rise of frighteningly realistic, computer-generated pornographic videos that can swap virtually any person’s face into an existing online porn video — sometimes so seamlessly it can be difficult to tell there was any manipulation at all.

The videos are called “deepfakes” and while some sites like Reddit have tried to crack down on their posting, experts say the videos may be covered by the First Amendment, meaning there’s not much anyone can do to slow their progression.

At first, deepfake creators targeted celebrities like Gal Gadot and Scarlett Johansson, the latter of whom has spoken up about how violating it was. “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause … The Internet is a vast wormhole of darkness that eats itself.”

But that’s just the tip of the iceberg. According to The Post, deepfake creators are also targeting women well out of the spotlight. Customers can pay creators to make realistic-looking pornographic videos of acquaintances, neighbors, co-workers and friends, and all they need is a sizable repository of images of the target woman’s face: a repository like, say, Facebook or Instagram.

The technology used to make deepfakes is the generative adversarial network, a machine-learning technique invented by researcher Ian Goodfellow in 2014, in which “[t]wo opposing groups of deep-learning algorithms create, refine and re-create an increasingly sophisticated result.” At first, the technology was choppy, and many deepfakes still have obvious glitches. But as creators continue to practice and the software improves, fakes are getting more difficult to spot and some look completely genuine. A popular clip from earlier in 2018 showed former President Obama saying things like “Killmonger was right.” But Obama hadn’t said any of it. It was a deepfake created by Jordan Peele. You have to look awfully closely to tell that something in the video doesn’t look quite right – much closer than your average person is probably going to. (Peele wasn’t being malicious. The video was a PSA to raise the alarm about what deepfakes are capable of doing.)

“It’s like an assault: the sense of power, the control,” Adam Dodge, the legal director of a California domestic violence shelter called Laura’s House, told The Post. “With the ability to manufacture pornography, everybody is a potential target.” Dodge has helped train police officers on how some abusive partners are using deepfakes as weapons against their spouses, threatening to leak the “incriminating” videos to damage their reputations.

And this gets at just how disturbing the rise of deepfakes really is. These videos are more than just material for a random private porn stash. They can also be weaponized and used as leverage. As the technology continues to improve, experts warn that videos of virtually anyone doing anything could be cheaply created with software readily available online. Since the invention of video, we’ve learned to treat it as an arbiter of truth. “Video evidence” is considered practically airtight in both courts of law and the court of public opinion. But with the rise of deepfake technology, the future isn’t going to be nearly so simple.

“For folks who don’t have a high profile, or don’t have any profile at all, this can hurt your job prospects, your interpersonal relationships, your reputation, your mental health,” feminist media critic Anita Sarkeesian, who has herself been the target of a deepfake, told The Post. “It’s used as a weapon to silence women, degrade women, show power over women, reducing us to sex objects. This isn’t just a fun-and-games thing. This can destroy lives.”