After outcries from users and efforts by other social media platforms to ban the content, Reddit has finally taken action against a community in which users posted fake porn generated by artificial intelligence-powered tools.

The community, called deepfakes, hosted content in which faces and footage were spliced together to create photo-realistic images and videos of celebrities and other people appearing in pornographic scenes. The community had amassed nearly 100,000 subscribers before it was shut down.

"Reddit strives to be a welcoming, open platform for all by trusting our users to maintain an environment that cultivates genuine conversation. As of February 7, 2018, we have made two updates to our site-wide policy regarding involuntary pornography and sexual or suggestive content involving minors," a spokesperson for Reddit told International Business Times.

“These policies were previously combined in a single rule; they will now be broken out into two distinct ones. Communities focused on this content and users who post such content will be banned from the site," the spokesperson said.

The newly split and expanded policies introduced Wednesday by Reddit explicitly prohibit posting involuntary pornography, which is defined as “images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission.” The policy includes depictions that have been faked.

Reddit also updated its policy regarding sexually explicit or suggestive content that involves minors. The site does not allow child sexual abuse imagery, child pornography or fantasy content including stories or drawn images that place children in sexual situations.

The crackdown from Reddit comes as the so-called “Deepfakes” have started to populate the web. The images and videos are a technical achievement that essentially maps one person’s face over the top of another, making it possible to place someone in a video and have their face show the same expressions as the person they are mapped over.

Deepfakes require a graphics processing unit (GPU) from Nvidia, a fair amount of processing power and some technical know-how to pull off. The trick uses Nvidia’s GPU to run an application that analyzes two sets of images and allows the user to map one face atop the other.
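The face-mapping technique described above is widely reported to rely on a shared encoder paired with a separate decoder per person: both faces are compressed into a common latent representation, and a swap is produced by decoding one person's face with the other person's decoder. The toy sketch below illustrates only that data flow with untrained random weights; the layer sizes, function names, and the use of plain NumPy are illustrative assumptions, not the actual deepfakes tool.

```python
import numpy as np

# Toy illustration of the reported shared-encoder / per-person-decoder idea.
# Weights are random stand-ins; a real system would train these on many
# images of each person using a GPU.
rng = np.random.default_rng(0)

def weights(n_in, n_out):
    # Random matrix standing in for trained network parameters.
    return rng.standard_normal((n_in, n_out)) * 0.01

ENC = weights(64 * 64, 128)    # shared encoder: 64x64 image -> latent code
DEC_A = weights(128, 64 * 64)  # decoder "trained" on person A's face
DEC_B = weights(128, 64 * 64)  # decoder "trained" on person B's face

def encode(img):
    # Flatten the image and project it into the shared latent space.
    return np.tanh(img.reshape(-1) @ ENC)

def decode(code, dec):
    # Reconstruct a 64x64 image from a latent code with a given decoder.
    return (code @ dec).reshape(64, 64)

face_a = rng.random((64, 64))            # a frame showing person A
latent = encode(face_a)                  # A's expression, pose, lighting
swapped = decode(latent, DEC_B)          # rendered through B's decoder

print(latent.shape, swapped.shape)       # (128,) (64, 64)
```

Because the encoder is shared, the latent code captures expression and pose independently of identity, which is why the swapped output tracks the source face's movements frame by frame.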

The powerful face-swapping tool has been used for a number of purposes, but its most nefarious use is placing the faces of celebrities and other people into porn scenes. Some of the results are realistic enough that it can be difficult to tell the content is fake.

Reddit is just the latest platform to take action to remove deepfakes. Pornhub announced earlier this week that it would be removing any faked content from its site. Twitter has also taken action to block the computer-generated videos, as have image hosting sites Imgur and Gfycat and communications platform Discord.

Reddit’s decision to ban the community is just the latest in the company’s efforts to clean up the content that appears on its site. Last year, the company took action against a number of communities that hosted explicit content as well as racist, fascistic and white nationalist communities.