Photo: The Facebook homepage displayed in Spanish in Buenos Aires, Argentina, May 10, 2012. Getty Images/JUAN MABROMATA

A tech blogger claimed that Facebook's Spanish-language search algorithm in Mexico was displaying inappropriate suggestions when users typed “girls” or “kids” into its crowd-sourced search engine.

Retweeting a screenshot from a Mexican Twitter user, Jane Manchun Wong pointed out that the search suggestions included terms such as “girls in underwear,” “girls in short dresses,” “hot girls” and “naked 15-year-old girls.” She further clarified that the inappropriate search phrases showed up only when one typed “girls” into Facebook's search in Mexico, not when one typed “boys.”

According to her profile on Crunchbase, Wong is a “tech blogger who has a history of uncovering yet-to-launch features or those still in testing through the use of reverse engineering tactics.”

In a couple of follow-up tweets, Wong said the social platform could not be solely blamed for the inappropriate suggestions. Because Facebook's search was crowd-sourced, it generated suggestions based on how frequently users searched for particular terms. The real issue, she said, therefore lay with societal attitudes in the region.

“Machines and algorithms usually aren't created to be perverted, racist, or both. Facebook is coming off like a pedophile because people kept feeding them those sort of inputs. They are the representation of user's behavior and biases. It won't stop unless these biases are stopped,” she wrote in a tweet, adding in another, “In my opinion, 'fixing ML' in the search suggestion could only do so much on hiding the fact of people being objectifying. Those who searched for those sort of topics on Facebook are still gonna seek for it after the ML being fixed. It has to be fixed in societal level too.”
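What Wong is describing is a popularity-ranked autocomplete: suggestions simply mirror whatever users type most often, so biased inputs yield biased output. Below is a minimal Python sketch of how such a crowd-sourced suggester behaves; the class and method names are invented for illustration and are not Facebook's actual system.

```python
from collections import Counter

class PopularityAutocomplete:
    """Toy crowd-sourced suggester: it ranks completions purely by how
    often users have already searched for them."""

    def __init__(self):
        self.query_counts = Counter()

    def log_search(self, query: str) -> None:
        # Every user search feeds back into the suggestion pool.
        self.query_counts[query.lower()] += 1

    def suggest(self, prefix: str, k: int = 5) -> list[str]:
        # Return the k most-searched queries that start with the prefix.
        prefix = prefix.lower()
        matches = Counter({q: n for q, n in self.query_counts.items()
                           if q.startswith(prefix)})
        return [q for q, _ in matches.most_common(k)]

engine = PopularityAutocomplete()
for q in ["girls soccer", "girls soccer", "girls names", "boys names"]:
    engine.log_search(q)

print(engine.suggest("girls"))  # -> ['girls soccer', 'girls names']
```

The sketch makes Wong's point concrete: the engine has no notion of appropriateness, only frequency, so whatever a population searches for most becomes what it suggests.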

Facebook has yet to address the matter.

Meanwhile, the Telegraph recently reported that YouTube was recommending self-harm and suicide videos containing graphic images to users as young as 13. The report came days after British Health Secretary Matt Hancock said the U.K. government could consider banning social media platforms that failed to protect children from content depicting or promoting suicide and self-harm.

"We are masters of our own fate as a nation, and we must act to ensure that this amazing technology is used for good, not leading to young girls taking their own lives," Hancock said. He also wrote a letter to the major social networks - Twitter, Snapchat, Pinterest, Apple, Google and Facebook – warning them to “step up and purge this content once and for all.”

Although YouTube neither confirmed nor denied recommending self-harm content to British children, it released the following statement on the allegations to Fox News: “We know many people use YouTube to find information, advice or support sometimes in the hardest of circumstances. We work hard to ensure our platforms are not used to encourage dangerous behavior. Because of this, we have strict policies that prohibit videos which promote self-harm and we will remove flagged videos that violate this policy. Our policies also prohibit autocomplete predictions for these topics, and we will remove any suggestions which don’t comply with our policies.”
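The last line of that statement describes a post-hoc filter on autocomplete predictions. A minimal sketch of what such a filter could look like, assuming a simple keyword blocklist (YouTube's real system is not public, and production systems rely on trained classifiers and human review; the names here are hypothetical):

```python
# Hypothetical, simplified blocklist; not YouTube's actual policy list.
PROHIBITED_TOPICS = {"self-harm", "suicide method"}

def filter_predictions(predictions: list[str]) -> list[str]:
    """Drop autocomplete predictions that touch a prohibited topic."""
    return [p for p in predictions
            if not any(topic in p.lower() for topic in PROHIBITED_TOPICS)]

print(filter_predictions(["suicide methods at home",
                          "suicide prevention hotline"]))
# -> ['suicide prevention hotline']
```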

It is unclear whether YouTube has fixed its recommendation algorithm for the British audience. In the U.S., the platform displays the National Suicide Prevention Lifeline's phone number at the top of the page when a user searches for a term such as “suicide.”
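That kind of intervention is a keyword-triggered interstitial: if a query matches a crisis term, a resources banner is pinned above the results. A minimal sketch under that assumption (the function and constant names are hypothetical; the number shown was the Lifeline's U.S. number at the time):

```python
CRISIS_TERMS = {"suicide", "self-harm"}
# The Lifeline's U.S. number at the time; shown here as an
# illustrative constant only.
LIFELINE_BANNER = "National Suicide Prevention Lifeline: 1-800-273-8255"

def render_results(query: str, results: list[str]) -> list[str]:
    """Prepend a crisis-resources banner when the query matches a crisis term."""
    page = list(results)
    if any(term in query.lower() for term in CRISIS_TERMS):
        page.insert(0, LIFELINE_BANNER)
    return page

print(render_results("suicide", ["video 1", "video 2"])[0])
# -> National Suicide Prevention Lifeline: 1-800-273-8255
```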