Technology has always been a double-edged sword: as it grows more advanced, so does its potential for use, and abuse. That abuse shows up in numerous, nefarious ways, deepfake videos being only the latest trend to hit the world.

For those who don’t know, the world hit by deepfake videos (or just deepfakes) is the often-sleazy one of pornography. Using artificial intelligence or machine-learning technology, the face of an adult performer is replaced with the face of another person, who could be a celebrity (as is usually the case), an acquaintance or an ex. The resulting video, with a face that doesn’t actually belong on the body below it but still looks convincingly real, is called a deepfake.

And deepfakes have become increasingly popular, given the relative ease of using the technology (there is an app) to create them. A large Reddit community (not to speak of other corners of the internet where this is far more rampant) is dedicated to churning out deepfakes, and even accepts requests. But some platforms have said they will not host deepfake content, and pornography website Pornhub is the latest to ban such videos.

In an email Tuesday to Motherboard, the website, which receives 75 million visitors every day, all looking for titillation, said it considered the doctored videos nonconsensual, and therefore a violation of its terms of service.

“We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it. Nonconsensual content directly violates our TOS [terms of service] and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission,” a Pornhub spokesperson said in the email.

[Image: Pornhub poster. The adult site will use AI to find performers and organize videos. Photo: Ethan Miller/Getty Images]

The website has long had a content removal page, where people can request that videos be taken down if they feature someone without their consent. But the ban on deepfake videos is a different matter, because it would likely mean the website taking down such content even if no one filed a complaint against it.

That stands to reason, because deepfakes can show the faces of people who have never actually recorded a sex act, and who therefore had no way of knowing their faces were morphed onto the bodies of other people performing sex acts. But how Pornhub would actually enforce the ban remains unclear.

There are still a number of deepfakes on the website, many of them clearly labeled as such, complete with celebrity names. Many of those videos have been posted in the past few days, as the Reddit community keeps putting them out.

Two other platforms, GIF-hosting site Gfycat and chat service Discord, have previously announced they were also taking down all deepfake content hosted on them.

Unless websites decide to self-regulate in the way these have (and Reddit hasn’t), there is no legal recourse for people whose faces show up in deepfake videos. And there is plenty of misuse this technology can be put to, even outside pornography.