A recent crowdsourced study by Mozilla found that YouTube's recommendation algorithm pushes harmful videos to viewers, surfacing content that violates YouTube's own policies.
For the research, conducted with the University of Exeter, Mozilla recruited more than 37,000 volunteers who tracked YouTube's recommendations through a browser extension.
After a ten-month data-gathering process, researchers scrutinized the acquired data and found that YouTube was recommending fake news, violent content, and sexual content to viewers. The research team was led by Brandi Geurkink, Senior Manager of Advocacy at Mozilla.
Addressing the findings, she noted that most of the flagged recommendations involved fake news, but some videos contained sexualized parodies of cartoons and racist content. She said, “When it’s actively suggesting that people watch content that violates the company’s policies, the algorithm seems to be working at odds with the platform’s stated aims, their own community guidelines, and the goal of making the platform a safe place for people.”
The research also found that viewers in non-English-speaking countries are at a higher risk of receiving such recommendations, with the rate of reported harmful videos roughly 60% higher in countries where English is not a primary language. YouTube has faced similar accusations in the past, which forced the company to change its algorithm. The new findings suggest those efforts have not been successful.
The video streaming platform, in a statement, said, “We constantly work to improve the experience on YouTube, and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content.”
Experts recommend that YouTube introduce an option for viewers to disable personalized recommendations. Geurkink expressed her frustration, saying, “We can’t just continue to have this paradigm where researchers raise issues, companies say ‘OK, it’s solved,’ and we go on with our lives.”