Likes and dislikes don’t seem to make much difference to YouTube

YouTube’s recommendation algorithms take little account of user feedback. Clicking “Dislike”, deleting a video from your watch history, asking not to be recommended a channel, and other negative signals reportedly have little influence on the suggestions YouTube goes on to offer. That, at least, is the conclusion of research conducted by Mozilla on a sample of 20,000 people, available (ironically) as a 47-page document on Google Drive: https://drive.google.com/file/d/1FjoHblvuZxTw-dnkMGeMb-2brh_Vo9td/view In particular, “Dislike” and “Not interested” proved barely effective, preventing only 12% and 11% of bad recommendations respectively. Other kinds of feedback, such as “Don’t recommend channel” and “Remove from history”, reached 43% and 29% of bad suggestions avoided, but still were not enough to eliminate them entirely. YouTube spokesperson Elena Hernandez unsurprisingly disputes Mozilla’s findings, telling The Verge that this is not how YouTube works, and deliberately so.

It’s important to note that our controls do not filter out entire topics or viewpoints, as doing so could have negative effects for viewers, such as creating echo chambers. […] Mozilla’s report does not take into account how our systems actually work, and so it is difficult for us to glean many insights from it.

“Dislike” removes a specific video, and “Don’t recommend channel” simply prevents that particular channel from being suggested in the future; neither blocks suggestions for other content on the same topic, expressing the same opinion, or featuring the same speaker. The problem with recommendations, and how little they seem tied to user opinion, is not unique to YouTube: for better or worse it affects every suggestion-driven social network, from Instagram to TikTok. And while Hernandez’s objections have a logic of their own, Mozilla replies that it is the platforms themselves that are not transparent with users about how their feedback is weighed.

I believe that in YouTube’s case the platform tries to balance user engagement with user satisfaction, which is ultimately a trade-off between recommending content that gets people to spend more time on the site and content the algorithm thinks users will actually like. The platform has the power to change which of these signals carries the most weight in its algorithm, but our research suggests that user feedback is not always the most important one.

Yes, it is indeed a question of balance, with each platform assigning whatever weights it likes to the various kinds of feedback it receives from users. Too bad those users often have not the slightest idea what these weights are.