This is beautifully written and points at what I believe to be deep truths. In particular:
Your brilliant mind can create internal structures that might damn well take over and literally kill you if you don’t take responsibility for this process. You’re looking at your own internal AI risk.
...
Most people wringing their hands about AI seem to let their minds possess them more and more, and pour more & more energy into their minds, in a kind of runaway process that’s stunningly analogous to uFAI.
But I won’t say more about this right now, mostly because I don’t think I can do it justice with the amount of time and effort I’m prepared to invest writing this comment. On that note, I commend your courage in writing and posting this. It’s a delicate needle to thread between many possible expressions that could rub people the wrong way or be majorly misinterpreted.
Instead I’ll say something critical and/or address a potential misinterpretation of your point:
What is this sobriety you advocate for?
I’m concerned that sobriety might be conflated with giving in to the cognitive bias toward naive/consensus reality. In one sense of the word, that is what “sobriety” is: a balance of cognitive hyperparameters, a psychological attractor that has been highly optimized by evolution and in-lifetime learning. Being sober makes you effective on-distribution. The problem arises when the distribution shifts.
I’ve noticed that people who have firsthand experience with psychosis, high doses of psychedelics, or religious/spiritual beliefs tend to have a much easier time “going up shock levels” and taking seriously the full version of AI risk (not just AI tiling the internet with fake news, but tiling the lightcone with something we have not the ontology to describe). This might sound like a point against AI risk. But I think it’s because we’re psychologically programmed with deep trust in the fundamental stability of reality, to intuitively believe that things cannot change that much. Having the consensus reality assumption broken once, e.g. by a psychotic episode where you seriously entertain the possibility that the TV is hacking your mind, makes it easier for it to be broken again (e.g. to believe that mind hacking is a cinch for sufficiently intelligent AI). There are clear downsides to this—you’re much more vulnerable to all sorts of unusual beliefs, and most unusual beliefs are false. But some unusual beliefs are true. For instance, I think some form of AI risk both violates consensus reality and is true.
A more prosaic example: in my experience, the absurdity heuristic is one of the main things that prevented and still prevents people from grasping the implications of GPT-3. Updating on words being magic spells that can summon intelligent agents pattern matches against schizophrenia, so the psychological path of least resistance for many people is to downplay and rationalize.
I think there’s a different meaning of sobriety, perhaps what you’re pointing at, that isn’t just an entropic regression toward the consensus. But the easiest way to superficially take the advice of this post, I think—the easiest way out of the AI doom fear attractor—is to fall back into the consensus reality attractor. And maybe this is the healthiest option for some people, but I don’t think it leaves them able to be useful on the problem.
But I agree that being driven by fear, especially fear inherited socially and/or tangled up with trauma, is not the most effective either, and often ends up fueling ironic self-fulfilling prophecies and the like. In all likelihood the way out which makes one more able to solve the problem requires continuously threading your own trajectory between various psychological sink states, and a single post is probably not enough to guide the way to that “exit”. (But that doesn’t mean it’s not valuable.)
Ah, I’m really glad you asked. I tried to define it implicitly in the post but I was maybe too subtle.
There’s this specific engine of addiction. It’s the thing that distracts without addressing the cause, and becomes your habitual go-to for dealing with the Bad Thing. That creates a feedback loop.
Sobriety is with respect to an addiction. It means dropping the distraction and facing & addressing the thing you’d been previously distracting yourself from, until the temptation to distract yourself extinguishes.
Alcohol being a seed example (hence “sobriety”). The engine of alcoholism is complex, but ultimately there’s an underlying thing (sometimes biochemical, but very often emotional) that’s a sensation the alcoholic’s mind/body system has identified as “intolerable — to be avoided”. Alcohol is a great numbing agent and can create a lot of unrelated sensations (like dizziness), but it doesn’t address (say) feelings of inadequacy.
So getting sober isn’t just a matter of “don’t drink alcohol”, but of facing the things that drive the impulse to reach for the bottle. When you extinguish the cause, the effect evaporates on its own — modulo habits.
I’ve witnessed this kind of addiction engine at play for a lot of rationalists. I don’t have statistics here, or a sense of how widespread it is, but it’s common enough that it’s an invitation woven into the culture. Kind of like alcohol is woven into mainstream culture. The addiction in this case is to a particular genre of intense thought — which, like alcohol, acts like a kind of numbing agent.
So in the same way, by “get sober” I’m pointing at facing the SNS (sympathetic nervous system) energy driving the intense thought, and getting that to settle down and digest, instead of just believing the thoughts point-blank. To get to a point where you don’t need the thoughts to be distracting. And then the mind can be useful for thinking about stuff that can freak you out.
But not so much before.
…kind of like, an alcohol-laden mind can’t think things through very well, and an alcoholic’s mind isn’t well-suited to deciding whether to have another drink even when they aren’t currently drunk.
So, no, I don’t mean anything about drifting back toward mainstream consensus reality. I’m talking about a very specific mechanism. Getting off a specific drug long enough to stop craving it.
Now that you’ve explained it, this seems obviously the right sense of sobriety given the addiction analogy. Thank you!
Quite welcome.