Oh, huh. I got the paper from this 80,000 Hours episode, and I thought I remembered the episode's thesis (that social media algorithms are radicalizing people), so I assumed the paper supported it. Either I misremembered the episode's conclusion, or the paper they linked doesn't support it.
I think the radicalization claim was discussed in Human Compatible, but now I'm not too sure.
Thanks for the correction!
If someone were to make the case that:
1) the algorithm used to radicalize people, and
2) it doesn't now,
then the paper appears to be an argument for 2.*
*I haven't read it; maybe someone who read it closely came to a different conclusion. Perhaps the algorithm pushes people a little toward reinforcing their existing beliefs. Or perhaps it's not the algorithm at all: people just search in ways that have that effect. I could also come up with a more complicated explanation: the algorithm points people toward 'mainstream' content, which tends to cover current events, while theory, the past (and the future), and more specialized coverage come more from people who know more, if not from smaller channels. If someone has studied Marx, are they more likely to be a fan?** Or does a little knowledge have more of an effect in that regard, while people who have studied more recognize the many figures who collectively had broad influence over time, the nuances of their disagreements, and the gap between theory and practice?
**If so, then when people look up his stuff on YouTube, maybe they're getting a different picture and being exposed to a different viewpoint.