YouTube removed more videos in the second quarter of 2020 than in any previous quarter.
During the coronavirus pandemic, when the video-sharing site could not rely on its human moderators as much as previously, YouTube increased its use of automated filters in order to take down videos that could potentially violate its policies.
YouTube says that it usually relies on a combination of its employees and its algorithmic systems. Human reviews train the machine learning systems, which are capable of handling many more issues than people can.
The company said that during the pandemic it has “cast a wider net” than usual in order to ensure that content is quickly removed, rather than choosing to “dial back our technology and limit our enforcement to only what could be handled with our diminished review capacity”.
While YouTube’s automated content removal systems are not necessarily more accurate than human review, the company said it “accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible”.
“This also means that in these areas specifically, a higher amount of content that does not violate our policies was also removed,” YouTube said in a blog post, adding that more content was taken down “out of an abundance of caution”.
As a result of this policy, appeals against video removals nearly doubled, rising from 166,000 in the previous quarter to 325,000 in the second quarter.
The number of videos that went back online after appeal also rose dramatically, from approximately 41,000 to 161,000 in the same period.
As well as taking down videos, YouTube is putting automated systems to other uses – most recently bringing the ‘Smart Reply’ feature that Google offers in Gmail to YouTube creators.