Yeah. It makes sense in retrospect that Eliezer’s writings, full of weighty meaning, would attract lots of people with a “meaning-shaped hole”. I wish we’d kept to fun puzzles about decision theory, evolution, etc.
But Unity of Knowledge! The fun puzzles for some are exactly what has pushed others off the deep end.
Another 20-some Insanity Wolf memes added to the collection. I appear to be immune to basilisks. I can distantly appreciate the structure of Roko’s Basilisk, but I saw nothing but incoherent rambling in Pasek’s Doom. I scoff at neurobabble and suspect anyone talking about “mental tech” to be infected with scientology. My faith, standing on a solid foundation of doubt, is a stout defence of my reason against vampires.
Perhaps not believing in total preference orderings or utility theory as the fundamental ground of motivation has something to do with that.
I scoff at neurobabble and suspect anyone talking about “mental tech” to be infected with scientology.
Scientology is a particular brand of techniques that comes bundled with the belief that anything not invented by Hubbard doesn’t work. Scientology itself has a bunch of techniques that do have real effects, and that mess up many people.
One way to deal with that is to reject the field categorically. That works for many people, but others feel like experimenting. If I were to meet a rationalist who gets into contact with Scientology (maybe because he read LukeProg’s endorsement of Scientology 101 on LessWrong), it would be worth articulating in more detail the dangers and the ways it messes people up.
Recently, a rationalist wrote on Facebook about considering meditation to be dangerous in general, and given outcomes like Pasek’s or Eric Bruylant’s, noticing some danger makes sense.
Both cases share similarities beyond just meditation. Both involve taking substances (even if, in Pasek’s case, the hormones were legal), a lot of meditation, and personality splits. In both cases most of what they did was autodidactic rather than learned from experienced teachers.
Another similarity the two share is that they were intent on creating rationalist communities outside of existing hubs.
Hopefully it’s not too late to try to keep the focus on the fun puzzles, etc! There really does seem to be an alarming amount of craziness floating around LW, along with the constant weird attempts to explicitly model things that we evolved to understand instinctively (e.g. most aspects of social interaction). Reading that stuff slightly negatively affected my mental health even though I thought it was mostly silly; to the extent it’s taken seriously, it seems like it could have more substantial negative effects.