Yeah, I actually regretted those choices last year. I ended up not using the Litany of Tarski at the Big Solstice this year, but if I had, I’d have stuck to things where the truth couldn’t be dependent on what people believed the truth was.
Oh, I’m surprised. Do you mean that, e.g., being “outcompeted by simulated brains in a Malthusian hellhole race to the bottom” is less likely if more people believe that being “outcompeted by simulated brains in a Malthusian hellhole race to the bottom” is more likely, thus leading to an inconsistency?
Awareness-raising is really important and high-value, but unfortunately it only makes a marginal dent in x-risk. I mean, it may be rational to anticipate a scarily high probability of a Malthusian hellhole race to the bottom whether or not we personally try to stop it. The Solstice hymnal and ritual made me realize that on an emotional level.
On the other hand, maybe the probability of existential disaster conditional on our best efforts is a thought that's too demoralizing even for a Solstice ritual.
Huh. Couple thoughts:
1) Solstice is meant to be scary. (Exactly how scary depends on which crowd we're doing it for.) "The world may end, and it may in fact depend on our actions" is one of its primary points.
2) You're on a short list of people who have described the Solstice as actually helping them realize things on an emotional level, which was an intended purpose. So, good to know.
3) On one hand, the probability of being "outcompeted by simulated brains in a Malthusian Hellhole race to the bottom" may not depend that much on our personal actions, and framing the question that way is useful. But I did find it distracting to notice that the outcome depends at least somewhat on my beliefs, and might also depend on the collective beliefs of everyone who attends Solstices, and that I should take responsibility for that.
I also think it’s useful to distinguish between Epistemic Rationality Rituals and Instrumental Rationality Rituals.
4) Re: your other comment about Tarski’s theorem—interesting. Kind of wrapping my brain around that now.
Fun fact! Paradoxical propositions that are true if and only if you don’t believe them are at the heart of Tarski’s theorem on the undefinability of truth, and MIRI has figured out a way to make sense of them. Basically, if proposition P is true if and only if you assign less than 10% probability to P, then you ought to assign probability 10% to P, and you ought to believe that you assign probability “approximately 10%” to P.
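To make that fixed point concrete, here's a toy numerical sketch (my own illustration, not the actual construction from the MIRI work on probabilistic reflection): if you knew your own credence in P exactly, no consistent value would exist, but if your model of your own credence is slightly blurred, the unique self-consistent credence sits just above 10% and converges to 10% as the blur shrinks.

```python
# Toy numerical sketch (my own illustration, NOT the MIRI construction).
#
# Let P be the proposition "the credence you assign to P is less than 10%".
# With exact self-knowledge there is no consistent credence: p < 0.1 makes P
# true (pushing p toward 1), and p >= 0.1 makes P false (pushing p toward 0).
# But if your model of your own credence is blurred -- here, uniform on
# [p - eps, p + eps] -- a self-consistent credence exists just above 0.10
# and approaches 0.10 as the blur eps shrinks.

def implied_credence(p: float, eps: float) -> float:
    """Credence you should give P if you model your own credence as uniform on
    [p - eps, p + eps]: the chance that blurred self-estimate falls below 0.10."""
    lo, hi = p - eps, p + eps
    if hi <= 0.10:
        return 1.0
    if lo >= 0.10:
        return 0.0
    return (0.10 - lo) / (2 * eps)

def self_consistent_credence(eps: float) -> float:
    """Bisect for the p where implied_credence(p, eps) == p.
    implied_credence is non-increasing in p, so implied_credence(p) - p is
    strictly decreasing and crosses zero exactly once."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if implied_credence(mid, eps) > mid:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

if __name__ == "__main__":
    for eps in (0.05, 0.01, 0.001):
        print(f"blur {eps:5.3f}: self-consistent credence = {self_consistent_credence(eps):.4f}")
```

For blur widths 0.05, 0.01, and 0.001 this prints roughly 0.1364, 0.1078, and 0.1008, heading toward the 10% fixed point where you believe P at "approximately 10%" and believe roughly the same thing about your own credence.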