Nitpick: Does it bug anyone else to apply the Litany of Tarski to statements with undefined truth values (e.g. numbers 2-4 above), or is it just me?
If by undefined you mean that we don’t know what the value is, then no, it doesn’t bother me. If by undefined you mean that they have no truth value, as is standard, then I don’t think they are undefined.
Yeah, I actually regretted those choices last year. I ended up not using the Litany of Tarski at the Big Solstice this year, but if I had, I’d have stuck to statements whose truth couldn’t depend on what people believed the truth to be.
Oh, I’m surprised. Do you mean that, e.g., being “outcompeted by simulated brains in a Malthusian hellhole race to the bottom” is less likely if more people believe that being “outcompeted by simulated brains in a Malthusian hellhole race to the bottom” is more likely, thus leading to an inconsistency?
Awareness-raising is really important and high-value, but unfortunately it only makes a marginal dent in x-risk. I mean, it may be rational to anticipate a scarily high probability of a Malthusian hellhole race to the bottom whether or not we personally try to stop it. The Solstice hymnal and ritual made me realize that on an emotional level.
On the other hand, maybe the probability of existential disasters conditioned on our best efforts is a thought that’s too demoralizing even for a Solstice ritual.
Huh. Couple thoughts:
1) Solstice is meant to be scary. (Exactly how scary depends on which crowd we’re doing it for.) “The world may end, and it may in fact depend on our actions” is one of its primary points.
2) You’re on a short list of people who have described the Solstice as actually helping you realize things on an emotional level, which was an intended purpose. So, good to know.
3) On one hand, the probability of being “outcompeted by simulated brains in a Malthusian hellhole race to the bottom” may not depend that much on our personal actions, and framing the question that way is useful. But I did find it distracting to notice that the outcome depended at least somewhat on my beliefs, and might also depend on the collective beliefs of everyone who attends Solstices, and that I should take responsibility for that.
I also think it’s useful to distinguish between Epistemic Rationality Rituals and Instrumental Rationality Rituals.
4) Re: your other comment about Tarski’s theorem—interesting. Kind of wrapping my brain around that now.
Fun fact! Paradoxical propositions that are true if and only if you don’t believe them are at the heart of Tarski’s theorem on the undefinability of truth, and MIRI has figured out a way to make sense of them. Basically, if proposition P is true if and only if you assign less than 10% probability to P, then you ought to assign probability 10% to P, and you ought to believe that you assign probability “approximately 10%” to P.
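To make that fixed point concrete, here’s a toy sketch (my own illustration, not MIRI’s actual construction; the smoothing trick and all the names are just assumptions for the demo). It replaces the hard rule “P is true iff you assign P less than 10% probability” with a smoothed version, solves for the self-consistent credence, and shows it approaching 10% as the smoothing sharpens:

```python
import math

def truth_given_credence(p, sharpness):
    """Smoothed stand-in for the hard rule 'P is true iff credence(P) < 0.1'.

    With a hard step there is no consistent credence: p < 0.1 makes P
    true (pushing you toward 1), and p > 0.1 makes P false (pushing
    you toward 0). Smoothing the step gives a continuous map whose
    fixed point we can actually locate.
    """
    x = sharpness * (p - 0.1)
    # Numerically stable logistic sigmoid(-x): high credence -> P false.
    if x > 0:
        e = math.exp(-x)  # in (0, 1], never overflows
        return e / (1.0 + e)
    return 1.0 / (1.0 + math.exp(x))

def self_consistent_credence(sharpness, tol=1e-12):
    """Bisect for the credence p satisfying truth_given_credence(p) == p.

    g(p) = truth_given_credence(p) - p is strictly decreasing, positive
    at p = 0 and negative at p = 1, so the fixed point is unique.
    """
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if truth_given_credence(mid, sharpness) > mid:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# As the smoothed step approaches a hard 10% cutoff, the unique
# self-consistent credence converges to 0.1 from above.
for k in (10, 100, 1000, 10000):
    print(f"sharpness={k}: credence={self_consistent_credence(k):.4f}")
```

The hard cutoff itself has no consistent deterministic answer (any credence strictly below 10% makes P true, any credence at or above it makes P false), which is presumably why the prescription hedges to believing you assign “approximately 10%”.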
Why do these statements have undefined truth values?
See somervta’s comment above. But I disagree with them on their second point.
If, in response to “If I’m (not) going to be outcompeted by simulated brains, I desire to (not) believe...”, I asked you “Am I going to be outcompeted by simulated brains?”, you probably wouldn’t say “yes” or “no”. There’s no territory to match up with the map (i.e., your belief about whether or not we’ll be outcompeted).
I don’t know… Maybe people define territory differently, to include events that haven’t happened and things that don’t exist yet?
Yep! Check out the B-theory of time.
You can say something like “I am going to be outcompeted by simulated brains in X% of Everett branches”, which is part of the territory (if you accept many-worlds) but is not verifiable. I agree that it’s better to stick with testable statements, especially when introducing people to the Litany of Tarski, so we will be more careful with this for next year’s Solstice.
What numbers and where are you referring to? I only see a bunch of song titles.
PhilipL is referring to the second, third and final Litanies of Tarski (in the Twilight section and the second Light section).
Ah, I see, thanks. I have to agree with PhilipL that applying the template to a possible future event turns the original meaning upside down. Unless maybe you subscribe to Eliezer’s idiosyncratic timeless “block universe” view.
Note: shminux is a particularly vocal individual who strongly disagrees with the timeless “block universe” model.
I don’t agree or disagree with untestables.