That’s A Lot Of Videos; YouTube Removes 11.4 Million Videos Using AI Content Review

If you noticed that one or maybe all of your YouTube videos have been removed, there is an answer for that.

During the second quarter of 2020, YouTube removed more videos than it ever has. That’s because the video-sharing site leaned more heavily on its algorithms in place of most of its human content moderators.

YouTube released its Community Guidelines Enforcement report on Tuesday, showing that more than 11.4 million videos were taken down between April and June.

“When reckoning with greatly reduced human review capacity due to Covid-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement. Because responsibility is our top priority, we chose the latter — using technology to help with some of the work normally done by reviewers,” the company wrote in a blog post.

YouTube warned that videos that would normally be fine on the platform may end up being removed in error because of the automated review technology it is currently relying on.

YouTube also said in its blog post that it put stricter automated rules in place in areas such as “violent extremism” and “child safety,” leading to an increase in video removals; child safety overtook spam as the top reason for removal.

The company knew that removing more videos that didn’t violate its rules would mean more appeals from content creators, so it added more staff to its appeals process to handle requests as quickly as possible.

The number of appeals for content takedowns went from 166,000 in the first quarter of 2020 to more than 325,000 in the second. It also meant YouTube reversed itself and reinstated more videos in the second quarter: more than 160,000, compared to just over 41,000 in the first quarter.

YouTube said in its blog post that for sensitive policy areas such as child safety and violent extremism, it saw more than triple the number of removals as usual during the second quarter, but it viewed the temporary inconvenience for creators as worth the end result. “We accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible.”
