The role that YouTube and its behind-the-scenes recommendation algorithm plays in encouraging online radicalization has been suggested by both journalists and academics alike. This study directly quantifies these claims by examining the role that YouTube’s algorithm plays in suggesting radicalized content. After categorizing nearly 800 political channels, we were able to differentiate between political schemas in order to analyze the algorithm traffic flows out and between each group. After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube’s recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels with slant towards left-leaning or politically neutral channels. Our study thus suggests that YouTube’s recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.
From the link:
[Submitted on 24 Dec 2019]
Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization
Mark Ledwich, Anna Zaitsev
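To make the abstract's methodology a bit more concrete, here is a minimal sketch (not the authors' code; all channel names, categories, and counts below are hypothetical) of the kind of flow analysis it describes: label each channel with a political category, then tally how recommendation traffic moves out of and between those categories.

```python
# Minimal sketch of category-to-category recommendation flows.
# Channel labels, categories, and counts are made up for illustration.

from collections import Counter
from itertools import product

# Hypothetical channel -> category labels (the paper categorizes ~800 channels).
channel_category = {
    "channel_a": "mainstream_news",
    "channel_b": "partisan_left",
    "channel_c": "partisan_right",
    "channel_d": "conspiracy",
}

# Hypothetical observed recommendations: (source channel, recommended channel, count).
recommendations = [
    ("channel_b", "channel_a", 120),
    ("channel_c", "channel_a", 95),
    ("channel_c", "channel_d", 10),
    ("channel_d", "channel_a", 40),
]

# Aggregate counts into a category-to-category flow table.
flows = Counter()
for src, dst, count in recommendations:
    flows[(channel_category[src], channel_category[dst])] += count

# Normalize each source category's outgoing flow so its shares sum to 1,
# which makes it easy to see where each group's recommendations point.
categories = sorted(set(channel_category.values()))
totals = {c: sum(flows[(c, d)] for d in categories) for c in categories}
for src, dst in product(categories, categories):
    if totals[src] and flows[(src, dst)]:
        share = flows[(src, dst)] / totals[src]
        print(f"{src} -> {dst}: {share:.0%}")
```

With the toy numbers above, every category's recommendations flow mostly toward mainstream news, which is the shape of result the abstract reports.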
It looks like that paper doesn’t say that?
“After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims.”
Is there prior work showing that it did once have that effect?
Oh, huh. I got the paper from this 80,000 Hours episode, and thought I remembered the thesis of the episode (that social media algorithms are radicalizing people), and assumed the paper supported that thesis. Either I was wrong about the 80,000 Hours episode’s conclusion, or the paper they linked doesn’t support their conclusion.
I think the radicalization conclusion was talked about in Human Compatible, but now I’m not too sure.
Thanks for the correction!
If someone were to make the case that:
1) It used to radicalize people
2) And that it doesn’t now
then the paper appears to be an argument for 2.*
*I haven’t read it; maybe someone came to a different conclusion after reading it closely. Perhaps the algorithm tends to push people a little towards reinforcing their beliefs. Or it’s not the algorithm at all, and people just search for things in ways that do that. I could also come up with a more complicated explanation: the algorithm points people towards ‘mainstream’ content more, but that content tends to cover current events. Theory, the past (and the future), or just more specific coverage might come more from, if not smaller channels, then people who know more. If someone has studied Marx, are they more likely to be a fan?** Or does a little knowledge have more of an effect in that regard, while people who have studied more recognize more figures who collectively had broad influence over time, the nuance of their disagreements, and practice versus theory?
**If so, then when people look up his stuff on YouTube, maybe they’re getting a different picture and being exposed to a different viewpoint.