How much of this ability is needed in order to avoid taking strong mystical experiences at face value?
Not sure how to quantify this. I also haven’t had a mystical experience myself, although I have experienced mildly altered states of consciousness without the use of drugs. (Which is not at all unique to dabbling in mysticism; you can also get them from concerts, sporting events, etc.) I imagine it’s comparable to the amount of ability needed to avoid taking a strong drug experience at face value while having it (esp. since psychoactive drugs can induce mystical experiences).
In the comment I was replying to, you were saying that some rationalists are being too risk-averse. It seems like you’re now backing off a bit and just talking about yourself?
I want to make a distinction between telling people what trade-offs I think they should be making (which I mostly can’t do accurately, because they have way more information than I do about that) and telling people I think the trade-offs they’re making are too extreme (based on my limited information about them + priors). E.g. I can’t tell you how much your time is worth in terms of money, but if I see you taking on jobs that pay a dollar an hour I do feel justified in claiming that probably you can get a better deal than that.
I’m worried that the epistemic risks get stronger the further you go down this path.
Yes, this is probably true. I don’t think you need to go very far in the mystical direction per se to get the benefits I want rationalists to get. Again, it’s more that I think there are some important skills that it’s worth it for rationalists to learn, and as far as I can tell the current experts in those skills are people who sometimes use vaguely mystical language (as distinct from full-blown mystics; these people are e.g. life coaches or therapists, professionally). So I want there to not be a meme in the rationality community along the lines of “people who use mystical language are crazy and we have nothing to learn from them,” because I think people would be seriously missing out if they thought that.
We do have empirical evidence about how strong these risks are, though.
That’s not clear to me because of blindspots. Consider the Sequences, for example: I think we can agree that they’re in some sense psychoactive, in that people really do change after reading them. What kind of epistemic risks did we take on by doing that? It’s unclear whether we can accurately answer that question because we’ve all been selected for thinking that the Sequences are great, so we might have shared blindspots as a result. I can tell a plausible story where reading the Sequences makes your life worse in expectation, in exchange for slightly increasing your chances of saving the world.
Similarly we all grow up in a stew of culture informed by various kinds of TV, movies, etc. and whatever epistemic risks are contained in those might be hidden behind blindspots we all share too. This is one of the things I interpret The Last Psychiatrist to have been saying.