YouTube’s recommendation system is criticized as harmful. Mozilla wants to research it

[Photo: Google headquarters in Mountain View, California. Mozilla wants to research YouTube’s recommendation rabbit hole. Seth Rosenblatt/CNET]

Critics have repeatedly accused YouTube’s video recommendation system of sending people down rabbit holes of disinformation and extremism. Now Mozilla, the nonprofit that makes the Firefox browser, wants YouTube’s users to help it research how those controversial algorithms work.

Mozilla on Thursday announced a project that asks people to download a software tool giving its researchers information on the video recommendations people receive on the Google-owned platform.

YouTube’s algorithms recommend videos in the “Up next” column along the right side of the screen, inside the video player after a video ends, and on the site’s homepage. Each recommendation is tailored to the person watching, taking into account signals like their watch history, channel subscriptions or location. The recommendations can be benign, like another live performance by the band you’re watching. But critics say YouTube’s recommendations can also lead viewers to fringe content, like medical misinformation or conspiracy theories.
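That personalization is exactly what Mozilla says is opaque. As a purely illustrative sketch, and not a description of YouTube’s actual system, a personalized ranker can be thought of as a scoring function over the kinds of signals mentioned above; every name, signal and weight here is invented:

```typescript
// Purely illustrative toy ranker; NOT YouTube's actual algorithm.
// All signals, weights and names here are invented for this sketch.

interface Video {
  id: string;
  topic: string;     // e.g. "live-music", "health"
  channelId: string;
  region: string;
}

interface Viewer {
  watchedTopics: Map<string, number>; // topic -> number of videos watched
  subscriptions: Set<string>;         // subscribed channel IDs
  region: string;
}

// Score one candidate against a viewer's profile: higher means
// "more likely to be recommended" in this toy model.
function score(video: Video, viewer: Viewer): number {
  const topicAffinity = viewer.watchedTopics.get(video.topic) ?? 0;
  const subscribed = viewer.subscriptions.has(video.channelId) ? 1 : 0;
  const local = video.region === viewer.region ? 1 : 0;
  return 0.6 * topicAffinity + 0.3 * subscribed + 0.1 * local;
}

// Fill the "Up next" column with the top-scoring candidates.
function recommend(candidates: Video[], viewer: Viewer, n = 5): Video[] {
  return [...candidates]
    .sort((a, b) => score(b, viewer) - score(a, viewer))
    .slice(0, n);
}
```

Even in a toy model like this, each additional video watched on a topic raises that topic’s score, which is the feedback loop critics describe: engaging with fringe content surfaces more of it.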

Mozilla’s project comes as YouTube, which sees more than 2 billion users a month, already contends with viral toxic content. Earlier this year, the company struggled to contain the spread of Plandemic, a video filled with false information about COVID-19. YouTube and other platforms have also drawn blowback for helping to spread the QAnon conspiracy theory, which baselessly alleges that a group of “deep state” actors, including cannibals and pedophiles, are trying to bring down President Donald Trump. The stakes will continue to rise over the coming weeks, as Americans seek information online ahead of the US presidential election.

“Despite the serious consequences, YouTube’s recommendation algorithm is entirely mysterious to its users,” Ashley Boyd, vice president of advocacy and engagement at Mozilla, said in a blog post. “What will YouTube be recommending that users in the US watch in the last days before the election? Or in the following days, when the election results may not be clear?” 

To take part in Mozilla’s project, people will need to install an “extension,” a type of software tool, for Firefox or Chrome, the browser Google makes. The tool, called RegretsReporter, will let people flag videos they deem harmful and send the information to Mozilla’s researchers.

People can also add a written report describing which recommendations led to the video, along with anything else they think is relevant. The extension will also automatically collect data on how much time a person spends on YouTube. Mozilla said it hopes to gain insights into which patterns of behavior lead to problematic recommendations.
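Mechanically, a reporting extension of this kind is typically built from standard WebExtensions pieces: a content script that reads the page and a background script that forwards reports. The sketch below is hypothetical and is not Mozilla’s RegretsReporter source code; the report shape, the message type and the DOM selector are all assumptions:

```typescript
// Hypothetical sketch of a reporting extension's content script.
// This is NOT Mozilla's RegretsReporter source code; the report shape,
// message type and DOM selector below are invented for illustration.

// Minimal typing for the WebExtensions messaging API.
declare const browser: {
  runtime: { sendMessage(message: unknown): Promise<unknown> };
};

interface RegretReport {
  videoUrl: string;          // the video the user is flagging
  recommendedFrom: string[]; // recommendations visible when flagged
  note: string;              // the user's free-text description
  reportedAt: string;        // ISO timestamp
}

// Collect links from the currently visible recommendation thumbnails.
// The selector is a guess; a real extension must track YouTube's markup.
function visibleRecommendations(): string[] {
  return Array.from(document.querySelectorAll<HTMLAnchorElement>("a#thumbnail"))
    .map(a => a.href)
    .filter(href => href.includes("/watch"));
}

// Send a user-initiated report to the extension's background script,
// which would forward it to the research backend.
function submitReport(note: string): void {
  const report: RegretReport = {
    videoUrl: location.href,
    recommendedFrom: visibleRecommendations(),
    note,
    reportedAt: new Date().toISOString(),
  };
  browser.runtime.sendMessage({ type: "regret-report", payload: report });
}
```

A real extension would also need to keep pace with changes to YouTube’s page markup, since selectors like the one above break whenever the site’s layout changes.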

Asked about the project, YouTube said it takes issue with Mozilla’s methodology.

“We are always interested to see research on these systems and exploring ways to partner with researchers even more closely,” Farshad Shadloo, a YouTube spokesman, said in a statement. “However, it’s hard to draw broad conclusions from anecdotal examples, and we update our recommendations systems on an ongoing basis to improve the experience for users.”

He said YouTube has made more than 30 policy and enforcement updates to the recommendation system in the past year. The company has also cracked down on medical misinformation and conspiracy content. 

Mozilla has scrutinized YouTube’s algorithms in the past. Last year, the organization awarded a fellowship to Guillaume Chaslot, a former YouTube engineer and outspoken critic of the company, to support his research on the platform’s artificial intelligence systems. In July, Mozilla unveiled a project it funded called “TheirTube,” which lets people see how YouTube’s recommendations could look for people with various ideological views. 
