YouTube keeps recommending ‘regrettable’ videos to users

Researchers found that users in non-English-speaking countries were recommended 60% more 'regrettable' videos.

Staff Writers
YouTube has been criticised for providing minimal evidence or data to back up its claims of detoxifying its 'serving up' algorithm. Photo: AFP

YouTube users have reported objectionable content in thousands of videos recommended to them by the platform itself, according to research by the Mozilla Foundation.

The findings, released on Wednesday, revealed many examples of YouTube recommending videos that other users had marked as “regrettable” – a broad category that includes misinformation, violence and hate speech.

The 10-month-long investigation used crowdsourced data gathered by Mozilla through an extension for its Firefox web browser, along with a version created for Chrome, which let users report potentially problematic content.

Mozilla gathered 3,362 reports submitted by 1,622 unique contributors in 91 countries between July 2020 and June of this year.

The foundation then hired 41 researchers from the University of Exeter in England to review the submissions and determine whether the videos should be on YouTube and, if not, which platform guidelines they might violate.

One surprise finding was that 71% of videos flagged by users as regrettable came from YouTube’s own recommendations, served in sidebars and on Watch Next screens.

Such videos also tended to be much more popular than others viewed by the volunteers, suggesting the company’s algorithm actually favoured objectionable content.

Nearly 10% of the regrettable videos, which Mozilla said accumulated 160 million views, were later pulled by YouTube for violating platform policy.

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” said Brandi Geurkink, senior advocacy manager at Mozilla.

The research also found that users in non-English-speaking countries were exposed to regrettable videos at a 60% higher rate, with the rate highest in Brazil, Germany and France.

A spokesman for YouTube told The Hill that the company constantly works to improve the user experience, saying, “In the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content.”

YouTube’s algorithm, which it uses to recommend hundreds of millions of hours of videos to users every day, is a notorious black box that researchers have so far been unable to access.

YouTube in 2019 said it made a series of 30 unnamed tweaks to its recommendations system for users in the US that reduced watch time of borderline content – videos which toe the line between acceptable and violating platform policy – by 70% among non-subscribers.

In 2021, it disclosed for the first time its “violative view rate” – the percentage of views that come from content that breaches community guidelines – at just over 0.1%.

Mozilla’s report calls on the platform to let researchers audit its recommendation algorithm and provide users a way to opt out of personalised suggestions.

It also urges lawmakers to step in and compel a base level of transparency, given how secretive platforms can be about their technology.

Some commentators say that viewers themselves are to blame as their “click history” drives the YouTube algorithm.

At least one billion hours of videos are watched daily on YouTube, the company has claimed.
