To keep the YouTube platform open to all, the Mountain View giant maintains a continuous cleanup effort against videos containing harmful content, removing them as soon as they are detected.
"Harmful content has been removed since YouTube started, but investment has accelerated in recent years. Thanks to this ongoing work, over the last 18 months, views of videos that were later removed for violating platform policies have dropped by 80%," says the company.
YouTube says it has already removed over 100,000 videos and closed 17,000 channels for hate speech, and has deleted 500 million comments.
Policy development for a global platform
Before removing any content that violates YouTube policies, Google explains that it needs to ensure that the line separating what is removed from what is allowed is drawn in the right place, to protect both freedom of expression and the community.
- Since 2018, we have introduced dozens of updates to our standards: many of them are minor clarifications, but others are deeper.
- Our update on hate speech represented a fundamental change in our policies. We spent several months developing this policy and working with our teams to create the tools and establish the training needed to ensure compliance.
- In April 2019, we also announced that we are working to update our harassment policy, including creator harassment. We will share our progress on this issue in the coming months.
Using technology to detect harmful content
- In 2017, we expanded our use of machine learning technology to help detect content that potentially violates our policies and flag it for review by a human team.
- More than 87% of the 9 million videos removed in the second quarter of 2019 were first detected by our automated systems.
- We are investing significantly in our automatic detection systems, and our engineering teams continue to update and improve them month after month. For example, an update to our spam detection system in the second quarter of 2019 led to a 50% increase in the number of channels terminated for violating our spam policies.
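The workflow described above, where automated systems flag potentially violating content for review by a human team, can be sketched roughly as follows. This is a minimal illustration, not YouTube's actual system: the class names, the `spam_score` field, and the 0.8 threshold are all hypothetical assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a flag-then-human-review pipeline.
# None of these names come from YouTube's real systems.

@dataclass
class Video:
    video_id: str
    spam_score: float  # assumed ML classifier output in [0.0, 1.0]

@dataclass
class ReviewQueue:
    items: list = field(default_factory=list)

    def enqueue(self, video: Video) -> None:
        self.items.append(video)

def triage(videos, queue, threshold=0.8):
    """Flag videos whose classifier score meets the threshold for
    human review; the model score alone never triggers removal."""
    for video in videos:
        if video.spam_score >= threshold:
            queue.enqueue(video)
    return queue

queue = triage(
    [Video("a1", 0.95), Video("b2", 0.40), Video("c3", 0.85)],
    ReviewQueue(),
)
print([v.video_id for v in queue.items])  # → ['a1', 'c3']
```

The key design point reflected here is that automation only narrows the funnel: high-scoring items go to a human queue rather than being deleted outright.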
Removing content before it becomes popular
- Improvements in our automated systems helped us detect and review content before it was reported by our community, allowing us to automatically remove more than 80% of flagged videos in the second quarter of 2019 before they received a single view.
- We are determined to keep reducing exposure to videos that violate our policies. That is why, at Google, we have over 10,000 people in charge of detecting, reviewing, and removing content that breaks our rules.