This is part of the problem, though: I don’t think all of those things are crazy, and some of them seem to follow from standard LW and EA axioms.
start believing in demons and try to quarantine their friends from memetic viruses… threaten people… plan terrorist attacks
Sure, those are really bad.
start having anxiety attacks about Roko’s basilisk all the time
Having anxiety attacks about things is pretty universally unhelpful. But if you’re using Roko’s basilisk as a shorthand for all of the problems of “AIs carrying out threats”, including near-term AIs in a multipolar setting, then it seems perfectly reasonable to be anxious about that. Labeling people crazy for being scared is inaccurate if the thing they fear is actually real and scary.
claim strongly that all life is better off being exterminated because all existence is suffering
Again, this depends on what you mean. I think if you take the EA worldview seriously then the obvious conclusion is that Earth life up until and including today has been net-negative because of animal suffering. My guess is also that most current paths through AI run an unacceptable risk of humans being completely subjugated by either some tech-government coalition or hitting some awful near-miss region of mind space. Does either of those beliefs make me crazy?
Having anxiety attacks about things is pretty universally unhelpful. But if you’re using Roko’s basilisk as a shorthand for all of the problems of “AIs carrying out threats”, then it seems perfectly reasonable to be anxious about that. Labeling people crazy seems misguided if the thing they fear is actually scary.
Agreed that being anxious is totally fine, but the LW team has to deal with a fairly steady stream of people (around 3–4 a year) who seem to seriously freak out about either Roko’s basilisk or quantum immortality/suicide. To be clear, these are interesting ideas, but usually when we deal with these people they are clearly not in a good spot.
Again, this depends on what you mean. I think if you take the EA worldview seriously then the obvious conclusion is that Earth life up until and including today has been net-negative because of animal suffering.
Net-negative, I think, is quite different from “all existence is suffering”. But also, yeah, I do think the reason I’ve encountered so much of this kind of craziness in the EA/Rationality space is that we discuss a lot of ideas with really big implications, and we have a lot of people who take ideas really seriously, which increases the degree to which people do go crazy.
My guess is that you are dealing with these ideas fine, though some people are not, which is sad; it also means I encounter a pretty high density of people I feel justified in calling crazy.
I think if you take the EA worldview seriously then the obvious conclusion is that Earth life up until and including today has been net-negative because of animal suffering.
Nit: I don’t consider “the EA worldview” to have any opinion on animal suffering. But (roughly speaking) I agree you can get this conclusion from the EA worldview plus some other premises that are also common among EAs.