Mozilla Accuses YouTube Of Pushing Violent, Hateful Content
A new study by Mozilla has found that YouTube continues to recommend hateful content to users, especially in non-English-speaking countries.
![](https://static2.srcdn.com/wordpress/wp-content/uploads/2021/07/YouTube-Flag-Algorithm.jpg)
Mozilla - the company behind the Firefox web browser - is going after YouTube, claiming Google lets the service push harmful videos and suggest content filled with misinformation, hate speech, and violence. Though YouTube's algorithms aren't perfect, the findings suggest a lot of this content is slipping through the cracks and going unnoticed by the popular video-sharing platform. Needless to say, these types of videos shouldn't be publicly available, especially to such a wide-ranging audience.
YouTube, like most social networks, has a monitoring system for videos that are less than PG. There is a wide range of content allowed on YouTube, but users sometimes post material that is over the top or that directly violates YouTube's Community Guidelines - hate speech, for example, or graphically violent and disturbing scenes. Normally, users who come across this content can flag it as inappropriate. But YouTube also has its own responsibility: running videos through an algorithm that rejects content violating the policies it has in place.
Mozilla thinks Google has dropped the ball. The issue isn't so much that these videos are slipping through the cracks as that they're being pushed through them by YouTube's recommendation system. Using a browser extension called RegretsReporter, Mozilla gathered data from volunteers describing what showed up in their YouTube recommendations - usually the first thing a user sees when the app is opened. Participants reported being recommended videos ranging from COVID-19 fear-mongering to political misinformation to wildly inappropriate children's cartoons. In non-English-speaking countries, the rate of these reported videos was 60 percent higher than in English-speaking countries. Mozilla doesn't pin down a reason for the gap, but it's a striking finding nonetheless.
![](https://static1.srcdn.com/wordpress/wp-content/uploads/2020/08/YouTube-AI-Policy-Enforcer.jpg)
The rate at which people consume recommended videos is staggering. According to a statement YouTube gave to NBC News, the recommendations system drives 200 million video views a day. A couple of reasons for this are obvious. First, YouTube's algorithm knows what's trending among viewers and pushes those videos. Second, a video can thrive on controversy, which in turn drives even more views. The algorithm clearly isn't working as intended, and according to YouTube, it's doing what it can to fix it. "We constantly work to improve the experience on YouTube and over the past year alone, we've launched over 30 different changes to reduce recommendations of harmful content," the company stated. While the response feels a bit detached, it shows the company is aware of the situation and how it is affecting users.
Users should flag any video they believe shouldn't be publicly available. The algorithm may think it knows what to recommend, but there will never be a perfect system - at least not yet. In the meantime, Mozilla's latest study will hopefully push YouTube even harder to change how it operates at the recommendation level.