Mozilla researchers analyzed seven months of YouTube activity from more than 20,000 participants to assess four controls YouTube says people can use to “adjust their recommendations”: Dislike, Not interested, Remove from history, and Don’t recommend channel. They wanted to see how effective these controls really are.
Each participant installed a browser extension that added a Stop recommending button at the top of every YouTube video they watched, in addition to the controls in the sidebar. Each click triggered one of the algorithm’s four feedback responses.
Then dozens of research assistants compared the rejected videos with tens of thousands of subsequent YouTube recommendations shown to the same users. They found that YouTube’s controls have a “negligible” effect on the recommendations participants received. Over the seven months, a rejected video generated, on average, about 115 bad recommendations—videos that were very similar to the ones participants had already told YouTube they didn’t want to watch.
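The core measurement here can be sketched in a few lines: for each rejected video, count how many later recommendations to the same user were judged too similar to it. This is a minimal illustration, not the study’s actual code; in the study the similarity judgment was made by human research assistants, so the topic-label comparison below is a hypothetical stand-in for that manual rating.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str  # label a human rater might assign (hypothetical stand-in)

def count_bad_recommendations(rejected: Video, later_recs: list[Video]) -> int:
    """Count later recommendations judged 'too similar' to a rejected video.

    In the study, human research assistants made this similarity call;
    comparing topic labels is only a sketch of that judgment.
    """
    return sum(1 for rec in later_recs if rec.topic == rejected.topic)

# Hypothetical data: one rejected video and three later recommendations.
rejected = Video("abc123", "conspiracy")
later = [
    Video("v1", "conspiracy"),
    Video("v2", "cooking"),
    Video("v3", "conspiracy"),
]
print(count_bad_recommendations(rejected, later))  # 2
```

Aggregating this count over every rejected video and every participant is what yields a figure like the average of about 115 bad recommendations per rejection.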
Previous research suggests that YouTube’s practice of recommending videos you’re likely to agree with and rewarding controversial content can harden people’s views and lead to political radicalization. The platform has also been criticized on several occasions for promoting sexually explicit or suggestive videos of children and for letting content that violated its own policies go viral. Following the scrutiny, YouTube pledged to crack down on hate speech, better enforce its guidelines, and stop using its recommendation algorithm to promote “borderline” content.
However, the study found that content that appeared to violate YouTube’s policies was still actively recommended to users even after they had submitted negative feedback.
Hitting Dislike, the most visible way of providing negative feedback, stopped only 12% of bad recommendations; Not interested stopped just 11%. YouTube advertises both options as ways to adjust its algorithm.
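Figures like “stops only 12% of bad recommendations” are most naturally read as a reduction relative to users who gave no feedback (the study’s control group). A minimal sketch of that calculation, with entirely hypothetical rates chosen only to reproduce the 12% figure:

```python
def prevention_rate(bad_rate_with_feedback: float, bad_rate_control: float) -> float:
    """Share of bad recommendations a control prevents, relative to a
    no-feedback control group: 1 - (rate with feedback / control rate)."""
    return 1.0 - bad_rate_with_feedback / bad_rate_control

# Hypothetical numbers: if the control group saw 0.22 bad recommendations
# per video watched and Dislike users saw 0.1936, the control prevented 12%.
print(round(prevention_rate(0.1936, 0.22), 2))  # 0.12
```

The function names and the rates are assumptions for illustration; the study reports only the resulting percentages, not the underlying per-group rates.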
YouTube spokeswoman Elena Hernández says: “Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, such as creating echo chambers.” Hernández also says Mozilla’s report doesn’t take into account how YouTube’s algorithm actually works. But that’s something no one outside of YouTube really knows, given the algorithm’s billions of inputs and the company’s limited transparency. The Mozilla study tries to look into this black box to better understand its outputs.