Facebook will use photo-matching technologies to curb revenge porn on its site, the social media platform announced Wednesday.

Facebook announced new ways to prevent revenge porn from being shared on the platform, Messenger and Instagram.

How Facebook Will Curb Revenge Porn

It all starts when a user sees an explicit image shared without their permission. He or she can report the image using the dropdown menu next to the post. After the image has been reported, specially trained representatives from Facebook’s Community Operations team will review the post and remove it if it violates the platform’s policies. In most cases, Facebook will also disable the account that shared the revenge porn without authorization, the company said.


The social media platform will start using photo-matching technologies to halt attempts to share intimate content on Facebook, Messenger and Instagram. If a person tries to share an image after it has been reported and removed, Facebook will alert the user that the image violates the platform’s policies and block the attempt to share it.
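Facebook has not published the details of its photo-matching technology, but the general idea behind such systems is perceptual hashing: a reported image is reduced to a compact fingerprint, and later uploads are compared against a blocklist of fingerprints rather than raw pixels, so that near-identical copies are still caught. The sketch below is purely illustrative (the hash function, threshold and `Blocklist` class are invented for this example, not Facebook's implementation):

```python
# Illustrative sketch only: Facebook has not disclosed its matching algorithm.
# This models the general perceptual-hashing approach: reduce an image to a
# short fingerprint and compare fingerprints instead of raw pixels.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid (values 0-255).

    Real systems first resize and gray-scale the image; here we assume that
    preprocessing has already produced the 8x8 grid.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: 1 if brighter than the average, else 0.
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")

class Blocklist:
    """Hypothetical store of fingerprints for reported-and-removed images."""

    def __init__(self, threshold=5):
        self.hashes = set()
        self.threshold = threshold  # max bit-distance still treated as a match

    def report(self, pixels):
        """Record the fingerprint of a reported image."""
        self.hashes.add(average_hash(pixels))

    def is_blocked(self, pixels):
        """Return True if an upload matches any reported fingerprint."""
        h = average_hash(pixels)
        return any(hamming(h, known) <= self.threshold for known in self.hashes)
```

A small tolerance in the fingerprint comparison is what lets the system stop re-shares that have been slightly cropped, compressed or recolored, rather than only byte-for-byte copies.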

If a user believes content was taken down by mistake, he or she can go through an appeal process.

Facebook also teamed up with safety organizations to offer resources and support to those affected by revenge porn.

For its tools against revenge porn, Facebook worked with the Cyber Civil Rights Initiative and other organizations. The company also sought advice from the National Network to End Domestic Violence, the Center for Social Research and the Revenge Porn Helpline, based in the United Kingdom. Facebook also got feedback from over 150 international safety organizations and experts.

“These tools, developed in partnership with safety experts, are one example of the potential technology has to help keep people safe,” Facebook said. “We look forward to building on these tools and working with other companies to explore how they could be used across the industry.”


Facebook’s announcement comes after a major incident last month in which a group of Marines shared pictures and videos of female Marines without authorization in the Facebook group Marines United.

Apart from the Marine nude photo-sharing scandal, a 2013 study by the Cyber Civil Rights Initiative found 93 percent of U.S. victims of revenge porn report significant emotional distress, while 82 percent report significant impairment in social, occupational or other important areas of their lives. The study, which surveyed 1,606 individuals, found 23 percent of respondents were victims of non-consensual photo-sharing.