With the COVID-19 pandemic forcing employees to work from home, YouTube relied more heavily on automated moderation during the second quarter of 2020. That shift prompted an uptick in automated policy enforcement but also resulted in the removal of thousands of videos that did not violate any rules, the company said in a Tuesday blog post.

Roughly 11.4 million videos were taken down between April and June, with the vast majority featuring actual violations of the platform’s policies against nudity, spam, child abuse and the encouragement of behaviors that could endanger young viewers. A large number of videos with no violations were swept up as well: around 325,000 removals were appealed during the quarter, and almost half of those were reinstated upon review.

“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential underenforcement or potential overenforcement,” the post stated.

YouTube also attributed the spike in removed videos to the greater number of users stuck at home making videos to pass the time. The company, however, declined to say exactly how many more videos were uploaded during the pandemic.

YouTube’s enforcement report echoes some of the points made in an earlier report from Facebook, which faced a similar dilemma after sending many of its human moderators home during the coronavirus outbreak. Facebook, however, reported markedly different results: fewer moderators led to less enforcement against sensitive and disturbing content.
