Mozilla’s new project, “TheirTube,” offers a glance at theoretical YouTube homepages for users in six different categories: fruitarian, doomsday prepper, liberal, conservative, conspiracist, and climate denier.

Mozilla sums up the types of videos YouTube’s recommendation algorithm is likely to suggest for each persona. The fruitarian will see videos showing “how to have a hardcore organic life.” Videos for the prepper will “explore apocalyptic scenarios and how to ‘prepare’ for them.” The liberal is recommended videos that “tend to support notions like feminism and multiculturalism,” while the conservative will see videos criticizing those ideologies. The conspiracist will be served videos that “suggest global events are in fact conspiracies.” Finally, the climate denier will see videos that attempt to “‘debunk’ scientific evidence about global warming.”
The suggested videos aren’t necessarily fake or misleading. But Mozilla notes that YouTube’s algorithm is designed to amplify content that keeps users clicking, even if that content is “radical or flat out wrong.”
“This project raises questions like: What if platforms were more open about the recommendation bubbles they created? And: By experiencing other users’ bubbles, is it possible to gain a better perspective of your own recommendation environment?” said Tomo Kihara, creator of TheirTube.
The YouTube recommendation algorithm accounts for 70 percent of videos watched on the site, according to Mozilla. The algorithm has long been criticized for recommending conspiracy videos rife with misinformation and videos featuring minors, the latter of which prompted Senator Josh Hawley to propose legislation requiring YouTube to fix the problem. YouTube responded by pledging last year not to recommend “borderline” videos that come close to violating community guidelines or that “misinform users in a harmful way.” However, as TheirTube aims to demonstrate, YouTube’s recommendation bubbles persist.