It captures the sort of person who gets hooked on TV Tropes and who first read LW by chasing hyperlink chains through the Sequences at random. It comes off as wrong, but in a way that seems somehow intentional, like there’s a thread of something running through it that would make the seemingly wrong parts all make sense; it’s too cohesive to dismiss, but not cohesive enough to make sense on its own. So you go chasing all those hyperlinks over bolded words, through endless glossary pages and anecdotes, down a rabbit hole in an attempt to learn the hidden secrets of the multiverse, and before you know what’s happened it has come to dominate all of your thinking. And there is a lot of good, helpful content mixed in with the bad, harmful content, which makes it all the harder to tell which is which.
The other thing that enabled it to get to me was that it was linked to me by someone inside the community whom I trusted and who told me it was good content, so I kept trying to take it seriously even though my initial reaction was knee-jerk horror. Later on, others kept telling me it was important and that I needed to take it seriously, so I kept pushing myself to engage with it until I started compulsively spiraling on it.
It captures the sort of person who gets hooked on TV Tropes and who first read LW by chasing hyperlink chains through the Sequences at random.
Hmm, no, I don’t think so.
I first read LW (well, it was OB at the time, but same deal) by chasing hyperlink chains through (what would come to be called) the Sequences at random. And I’ve read my share of TV Tropes. So this doesn’t check out.
Whatever the culprit quirk is, it’s clearly got nothing to do with whatever it is that makes people… read things by clicking on hyperlinks from other things.
The other thing that enabled it to get to me was that it was linked to me by someone inside the community whom I trusted and who told me it was good content, so I kept trying to take it seriously even though my initial reaction was knee-jerk horror. Later on, others kept telling me it was important and that I needed to take it seriously, so I kept pushing myself to engage with it until I started compulsively spiraling on it.
Hmm, I see. Would you say that the problem here was something like… too little confidence in your own intuition / too much willingness to trust other people’s assessment? Or something else?
(Did you eventually conclude that the person who recommended Ziz’s writings to you was… wrong? Crazy? Careless about what sorts of things to endorse? Something else?)
Hmm, I see. Would you say that the problem here was something like… too little confidence in your own intuition / too much willingness to trust other people’s assessment? Or something else?
That was definitely a large part of it; I let people sort of ‘epistemically bully’ me for a long time out of the belief that it was the virtuous and rationally correct thing to do. The first person who linked me Sinceriously retracted her endorsement of it pretty quickly, but by then I had already gotten somewhat hooked on the content and had no one to actually help steer me out of it, so I kept passively flirting with it over time. That was an exploitable hole, and someone eventually found it and used it to exploit me for a while, in a way that kept me further hooked on the content through a compulsive fear that Ziz was wrong but also correct and going to win, and that this was bad, so she had to be stopped.
Did you eventually conclude that the person who recommended Ziz’s writings to you was… wrong? Crazy? Careless about what sorts of things to endorse? Something else?
The person who kept me hooked on her writing for years was in a constant paranoia spiral about AI doom and was engaging with Ziz’s writing as a form of obsessive-compulsive self-harm. They kept me doing that with them for a long time by insisting that they had the one true rationality, that if I didn’t like it I was just crazy and wrong and lying to myself, and that only by trying to be like them could the lightcone be saved from certain doom. I’m not sure what there is to eventually conclude from all of that, other than that it was mad unhealthy on multiple levels.
EDIT: The thing to conclude was that JD was grooming me.
I see, thank you.
Insufficient defence of the passions against reason, then?
Something like that. Maybe it’d be worth adding that the LW corpus/HPMOR sort of primes you for this kind of mistake by attempting to align reason and passion as closely as possible, thus making ‘reasoning passionately’ an exploitable backdoor.