After hosting disturbing content for kids on its platform, YouTube fumbles again. ERIC PIERMONT/AFP/Getty Images

YouTube tweeted an apology hours after one of the worst autocomplete failures yet seen on the Google-owned video platform was reported Sunday.

A user shared a screen recording with YouTube that shows her typing the words “how to” into the site’s search bar as YouTube’s autocomplete suggestions appear, ranging from “how to tie a tie” to “how to make slime without glue.”

But when the user added the word “have” after “how to,” up popped an autocomplete suggestion reading “how to have s*x with your kids.”

When the user brought the result to the attention of @TeamYouTube, the team issued an official apology Sunday evening that read, “This is an awful autocomplete result and we really appreciate you making us aware. We've removed it and will continue to investigate what caused this!”

The latest incident is one of many in which YouTube has been shown to mishandle its primary video hosting service.

Recently, multiple major advertisers said they would suspend their advertising campaigns on YouTube after finding their ads displayed alongside videos that had drawn millions of views by showing children in compromising situations.

Many of the comments these videos attracted were also pedophilic in nature.

According to the Wall Street Journal, the advertisers that have suspended their YouTube ads include such heavyweights as Adidas, Mars Inc., Diageo and Captain Morgan.

The existence of such videos, and the kind of views and comments they received, drew global attention when BuzzFeed reported on them last week.

In its report, BuzzFeed described a “vast, disturbing, and wildly popular universe of videos,” including live-action footage of children in bedclothes.

In response, YouTube took down many of these videos and said it would enforce its community guidelines more strictly.

In the wake of this, accusations arose that YouTube was abetting pedophiles by hosting such videos in plain sight.

Before this, Google was forced to remove content that promoted terrorism and extremist views, after having left such content unmonitored on the platform for years. YouTube also came under pressure recently to remove disturbing content from its YouTube Kids platform.

Among these were cartoons that depicted popular kids’ animated characters performing odd and violent acts.

It is against this backdrop that the latest autocomplete fiasco has occurred.

What is happening with YouTube can be seen in the larger context of how social media platforms, including Twitter and Facebook, function.

While the growth of such companies has been driven in large part by a strong ethos of free speech, that same ideology has also attracted trolling, harassment and fringe content to these platforms.

Algorithms have so far proven largely unsuccessful at curbing such problems, which suggests that a huge amount of human labor will be needed to monitor these platforms effectively.

So far, none of these platforms has come out with a well-laid plan to prevent such issues from arising in the future, at least none that the public knows of.