It’s important to note that our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, such as creating echo chambers. […] The Mozilla report does not take into account how our systems actually operate, and it is therefore difficult for us to draw many useful insights from it.
The “Dislike” button removes a specific video, and “Do not recommend channel” simply prevents that particular channel from being suggested in the future, but neither prevents suggestions of other content on the same topic, expressing the same opinion, or featuring the same speaker. The problem with recommendations, and how they often seem largely unrelated to users’ stated preferences, is common not only to YouTube but, for better or worse, to all suggestion-based social networks, from Instagram to TikTok. And while Hernandez’s objections have their own logic, Mozilla responds that it is the platforms themselves that are not transparent with users about how their feedback is taken into account.
I believe that in the case of YouTube, the platform tries to balance user engagement with user satisfaction, which is ultimately a trade-off between recommending content that gets people to spend more time on the site and content that the algorithm thinks users will like. The platform has the power to change which of these signals carries the most weight in its algorithm, but our research suggests that user feedback isn’t always the most important one.
Yes, it is indeed a question of balancing, where each platform assigns whatever weights it wishes to the different feedback signals received from users. Too bad that users often have no idea what those weights are.
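To make the idea of weighted balancing concrete, here is a minimal sketch, not YouTube’s actual system: a ranking score that blends engagement and satisfaction signals under tunable weights. All signal names, weight values, and the `recommendation_score` function are hypothetical, chosen only to illustrate how a platform could make explicit feedback like “Dislike” count for very little relative to watch time.

```python
def recommendation_score(signals, weights, blocked_channels=(), channel=None):
    """Combine per-video signals into a single ranking score.

    signals: dict of signal name -> value, e.g. {"watch_time": 0.9}
    weights: dict assigning each signal its (hypothetical) importance
    A video from an explicitly blocked channel is excluded outright,
    mirroring the "Do not recommend channel" control.
    """
    if channel in blocked_channels:
        return float("-inf")  # never surface this channel again
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())


# Hypothetical weighting that favors engagement over explicit feedback.
weights = {"watch_time": 0.7, "predicted_like": 0.2, "dislike": -0.1}

video_a = {"watch_time": 0.9, "predicted_like": 0.1, "dislike": 1.0}
video_b = {"watch_time": 0.4, "predicted_like": 0.9, "dislike": 0.0}

score_a = recommendation_score(video_a, weights)  # 0.63 + 0.02 - 0.10 = 0.55
score_b = recommendation_score(video_b, weights)  # 0.28 + 0.18 + 0.00 = 0.46
# Despite the dislike, video_a still outranks video_b, because
# engagement carries most of the weight in this toy configuration.
```

The point of the sketch is the trade-off the article describes: by shifting weight between `watch_time` and the feedback signals, the platform alone decides how much a “Dislike” actually matters, and users see none of those numbers.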