I think I might just commit to staying away from LSD and Mind Illuminated-style meditation entirely. Judging by the frequency of word-of-mouth accounts like this, the chance of going a little or a lot insane while exposed to them seems frighteningly high.
I wonder why these long-term effects seem so sparsely documented. Maybe you have to take the meditation really seriously and practice diligently for this stuff to have a high chance of happening, and people in this community often do that, but the average study population doesn’t?
There can also be factors in this community that make people both unusually likely to go insane and to also try things like meditation and LSD in an attempt to help themselves. It’s a bit hard to say given that the post is so vague on what exactly “insanity” means, but the examples of acausal trade etc. make me suspect that it’s related to a specific kind of anxiety which seems to be common in the community.
That same kind of anxiety also made me (temporarily) go very slightly crazy many years ago, when I learned about quantum mechanics (and I had neither done psychedelics nor had I yet started meditating at the time), and it feels like the same kind of thing that causes the occasional person to freak out about Roko’s Basilisk. I think those kinds of people are particularly likely to be drawn to LW, because they subconsciously see rationality as a way to try to control their anxiety, and that same thing causes them to seek out psychedelics and meditation. And then rationality, meditation, and psychedelics are all things that might also dismantle some of the existing defenses their mind has against that anxiety.
I suspect it’s related to the fact that we’ve gotten ourselves off-distribution from the emergencies that used to be common, and thus AI and the Singularity are interpreted as immediate emergencies when they aren’t.
I’ll also make a remark that LW focuses on the tails, so things tend to be more extreme than usual.
Yeah, I think people who are high in abstract thinking, in believing their beliefs, and in anxious thought patterns should really stay away from psychedelics and from leaning too hard into their runaway thought trains.
Also, try to stay grounded with people and activities that don’t send you off into abstract thought space. Spend some time with calm normal people who look at the world in straightforward ways, not only creative wild thinkers. Spend time doing hobbies outdoors that use your physical body and attention in satisfying ways, keeping you engaged enough to stay out of your head.
people who are high in abstract thinking and believing their beliefs and anxious thought patterns
I think someone who fits this description can avoid the risks of ‘going insane’ while still using their abilities for good. For example, in my own case (I think the first two describe me, and the third one sort-of does), if I were to apply these suggestions…
try to stay grounded with people and activities that don’t send you off into abstract thought space. Spend some time with calm normal people who look at the world in straightforward ways, not only creative wild thinkers. Spend time doing hobbies outdoors that use your physical body and attention in satisfying ways, keeping you engaged enough to stay out of your head.
then my creative output related to alignment would probably drop significantly.
(I agree with not trying psychedelics, though. Even e.g. nootropics and ADHD meds are things I’m really cautious with, ’cause I don’t wanna mess up some part of my process.)
For anyone reading this post in the future, I’d instead suggest doing things meant to help you channel your ability: being conscious and reflective about your thoughts, revisiting basic rationality techniques and theory occasionally, and noticing privileged hypotheses (while still allowing yourself to ponder them if you’re doing it just because you find it interesting; I think letting one’s mind explore is also important for generating important ideas and making connections).
“Please don’t throw your mind away” in this other sense of counteracting your tendency to think abstractly; you might be able to do a lot of good with it.
I think your suggestions are good as well. To be clear: I didn’t mean that I think one should spend a large fraction of their time just ‘staying grounded’. More like, a few hours a week.
The way I model attention is that it is (metaphorically) a cirrus of thought, a slender tentacle-like appendage, that you extend into the world and then retract into your mind. If you leave it out for too long, it gets tangled up in the forest of all knowledge; if you keep it inside for too long, you become unable to respond to your environment.
People who are extremely online tend to send their attention cirrus into the internet, where it is prone to become a host to memes that use addiction to bypass your mind’s typical defenses against infection.
Anything that you enjoy to the point of losing self-control falls into the category of disease: whether that’s social media, programming, fiction, gaming, tentacle pornography, research, or anime.
Even if they were somehow extremely beneficial normally (which seems fairly unlikely), any significant risk of going insane seems much too high. I would posit that they carry such a risk for exactly the same reason: when using them, you are deliberately routing around very fundamental safety features of your mind.
The MBSR studies are two-month interventions. They are not going to have the same strong effects as people meditating seriously for years.
On the other hand, the studies that investigate people who meditate a lot are often from a monastic setting where people have teachers, which is quite different from someone meditating without a teacher and orienting themselves with The Mind Illuminated.
Possible selection effect? Maybe meditation moves people in a random direction. Those who get hurt mostly stop meditating, so you won’t find many of them in the “meditating seriously for years” group.
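This selection story can be made concrete with a toy simulation (all numbers invented for illustration): suppose meditation shifts each practitioner’s wellbeing in a random, zero-mean direction each year, and anyone who drops below some threshold quits. The surviving “long-term meditators” then look better off than average even though the practice did nothing on net.

```python
import random

random.seed(0)

# Toy model (all numbers invented): meditation nudges each practitioner's
# wellbeing in a random, zero-mean direction each year; anyone whose
# wellbeing falls below a threshold quits and never shows up in the
# "meditating seriously for years" group.
N_PEOPLE, YEARS, QUIT_THRESHOLD = 10_000, 10, -2.0

long_term, dropouts = [], []
for _ in range(N_PEOPLE):
    wellbeing, quit_early = 0.0, False
    for _ in range(YEARS):
        wellbeing += random.gauss(0, 1)  # zero-mean effect: no true benefit
        if wellbeing < QUIT_THRESHOLD:
            quit_early = True
            break
    (dropouts if quit_early else long_term).append(wellbeing)

avg = lambda xs: sum(xs) / len(xs)
print(f"long-term meditators: n={len(long_term)}, avg wellbeing {avg(long_term):+.2f}")
print(f"dropouts:             n={len(dropouts)}, avg wellbeing {avg(dropouts):+.2f}")
# The survivors look clearly better off than the dropouts even though the
# true average effect was zero: pure selection, not benefit.
```

The absorbing “quit” barrier is the whole trick: conditioning on still meditating after ten years filters out exactly the people the random walk hurt.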
I find it ironic that in a community that values clear thinking many people do things with their brains similar to giving their computer a hard kick or setting it on fire and expecting that to improve its performance.
It’s the fucking impulsive contrarianism, isn’t it? The more people keep telling you about those who have ruined their lives by taking drugs, the more certain you feel that you will do it the right way that will magically give you mental superpowers, unlike all those idiots who were simply doing it wrong. Induction is for losers. Also, you are smarter than everyone, and you did your “research” on the internet, or asked a friend.
I wonder why these long term effects seem relatively sparsely documented.
I think the long-term effects of LSD and other drugs are documented sufficiently. It’s just, if there are 100 boring statistics about people who fucked up their lives, and 1 exciting speculative book by Timothy Leary, everyone will talk about the latter.
Ah, you meant meditation. I guess the standard excuse is that people who got hurt by meditation were either doing it wrong, or they had some extremely rare pre-existing condition. (Translated: no matter how many people get hurt by doing X, it obviously does not apply to me. Because they were stupid and they were doing it wrong, and I am smart and I will be doing it right. Also, they were weak, and I am invulnerable.) The same excuse applies to people who join an MLM pyramid scheme and lose their money, or people who pray to God to cure their sick child and then the child dies anyway. The theory is always correct; if it doesn’t work for you, you were clearly applying it incorrectly.
How many more people have to die before we learn the thing that a random 10-year-old kid could tell us?
I think that before you write strongly-worded comments accusing people of being idiots for privileging anecdotes more heavily than statistics, you should first establish that the side you’re taking is actually the one supported by statistics and that it’s the other side which is relying on anecdotes, and not vice versa.
My read is that for meditation and psychedelics, the actual research tends to show that they are generally low-risk/beneficial (even to the point of their mental health benefits starting to gradually overcome the stigma against psychedelics among academic researchers) [e.g. 1, 2, 3 for psychedelics] and it’s actually the bad cases that are the unrepresentative anecdotes.
it’s actually the bad cases that are the unrepresentative anecdotes.
How unrepresentative? What probability of becoming the “bad case” would you consider acceptable?
If there is a hypothetical number of bad cases, all happening inside the rationality community, that would make you change your mind, approximately how big would that number be?
What probability of becoming the “bad case” would you consider acceptable?
Acceptable for writing highly derisive comments about people who try psychedelics? I’m not really a fan of that approach in any case, tbh.
Acceptable for psychedelics being worth trying? I don’t know, that seems like it would depend on the person’s risk tolerance and what they’re hoping to get out of it. I don’t consider it my business to decide e.g. what level of risk is unacceptable if someone wants to try extreme sports, nor do I consider it my business to tell people at what risk level they are allowed to try out psychedelics.
I’m more in favor of talking about the possible risks honestly and openly but without exaggeration, and also talking about responsible use, how to ameliorate the risks, and what the possible risk factors are.
The point of Kaj Sotala’s comment is that there is a selection bias severe enough that your comments need major caveats (deepthoughtlife made a similar error). I won’t determine your risk tolerance for medicine, but what I can say is that we should update in the opposite direction: that psychedelics are safe and maybe useful for the vast majority of people, and that the people who were truly harmed are paraded as anecdotes, showing massive selection bias rather than representing the median person in the world.
Suppose you start taking LSD. Not as part of a scientific experiment where the dosage was reviewed and approved by a research ethics board, but based on a recommendation from your friend and internet research you did yourself, using doses as big as your friend/research recommends, and repeating as often as your friend/research considers safe.
(Maybe, let’s also include the risk of self-modification, e.g. the probability that once you overcome the taboo and find the results of the experiment appealing, you may be tempted to try a greater dose the next time, or increase the frequency. I am mentioning this, because—yes, anecdotally—people experimenting with psychoactive substances sometimes do exactly this.)
Are you saying that the probability of serious and irreversible harm to your brain is smaller than 1%?
Or are you saying that the potential benefits are so large, that the 1% chance of seriously and irreversibly harming your brain is totally worth it?
I think that at least one of these two statements needs to be true in order to make experimenting with LSD worth it. I just don’t know which one (or possibly both?) you are making.
Note that the 1% probability of hurting your brain (heck, even 40% probability) is still hypothetically compatible with the statement that for a median person the experiment is a net benefit.
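This median-vs-tail point can be illustrated with a toy outcome distribution (numbers invented): if 99% of users gain a little and 1% lose catastrophically, the median user benefits while the expected value is strongly negative.

```python
import statistics

# Toy outcome distribution (units arbitrary, numbers invented):
# 99% of users get a small benefit of +1; 1% suffer a large harm of -500.
outcomes = [+1.0] * 99 + [-500.0] * 1

print("median outcome:", statistics.median(outcomes))  # the median user benefits
print("mean outcome:  ", statistics.mean(outcomes))    # but the expectation is negative
# Whether the gamble is "worth it" depends on whether you care about the
# median case or the expected value, and on your tolerance for tail risk.
```

So “most people who try it are fine” and “trying it is a bad bet” can both be true at once; the disagreement is about which summary statistic matters.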
I suspect a large part of the problem is that LW is trying to find answers in a field that has no reliable results: improving minds (aka nootropics). We know that existing drugs at most give you emotional health, and even that has limits. So it’s not surprising that attempts to improve minds currently fail a lot.
I would place an 80–90% prior probability on the boring answer being correct: that the brain is a complicated mess that is hard to affect in non-genetic ways. That said, even if genetics can figure out how to improve intelligence, there’s a further problem: people would figuratively riot because of equality memes saying that intelligence doesn’t matter and that anyone can become talented (this is absolutely not true, but equality memes like this don’t care about truth).
I would guess that different brains are damaged in different ways. Sometimes it’s genetic. Sometimes it’s just too much or too little of some chemical produced in the brain (potentially also for genetic reasons), which might be fixable by a chemical intervention. (Or maybe not, because the damage caused by the chemical imbalance might be irreversible.)
But different brains will require different chemical interventions. Maybe your friend was X-deficient and took extra X, and it made the symptoms go away. But your brain may be Y-deficient, so adding X will not help. Or maybe your brain already has too much X, and adding more X will fuck you up immediately.
If a doctor told me that statistically, people in my condition are likely to benefit from X, and the doctor would prescribe me a safe dose of X, and then monitor whether my condition improves or not… I might actually try it.
But that is completely different from e.g. a friend telling me that they know someone who took X and was happy about the outcome. First, it’s not obvious that X was actually responsible for the outcome. Maybe the person changed a few things in their life at the same time, and something else worked. Or maybe the person is just addicted, and “happiness” is what their addicted brain reports when asked how they feel about taking X. But most importantly, it may be the case that X helps some people, and hurts other people, and this person is a lucky exception, while those bad cases everyone heard about are the rule. And if I tried X and it wouldn’t work for me, I can already predict that the friend’s advice would be something like “try more” or “try something stronger”.