If you know you probably would not have survived the sun’s having failed to rise, you cannot just apply the Rule of Succession to your knowledge of past sunrises to calculate the probability that the sun will rise tomorrow, because doing so would ignore relevant information, namely the existence of a severe selection bias. (Sadly, I do not know how to modify the Rule of Succession to account for the selection bias.)
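(For concreteness, here is the unmodified rule in a short Python sketch — the version I am saying one cannot just apply. The sunrise count is made up for illustration.)

```python
# Laplace's Rule of Succession: after observing s successes in n trials,
# assign probability (s + 1) / (n + 2) to a success on the next trial.

def rule_of_succession(successes: int, trials: int) -> float:
    """P(success on next trial | s successes in n trials)."""
    return (successes + 1) / (trials + 2)

n = 10_000  # made-up count of observed sunrises, all "successes"
print(rule_of_succession(n, n))  # ~0.9999 -- but this treats the observed
# run of successes at face value; the selection effect (no observer survives
# a failure) means the run is less informative than the formula assumes.
```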
Bostrom has made a stab at compensating, although I don’t think http://www.nickbostrom.com/papers/anthropicshadow.pdf works for the sun example.
On the other hand, if you have so much background knowledge about the Sun that you can think about the selection effects involved, the Rule of Succession is a moot & incomplete analysis to begin with.
Regarding your second paragraph, Sir Gwern: if we switch the example to the question of whether the US and Russia will launch nukes at each other this year, I have a lot of information about the strength of the selection bias (including, for example, Carl Sagan’s work on nuclear winter) that I might put to good use if I knew how to account for selection effects. But because my causal model of the mental processes behind the decision to launch is so unsatisfactory, I would be sorely tempted to use something like the Rule of Succession, modified to account for the selection bias, where the analog of a day on which the sun might or might not rise is the start of the part of a military or political career during which the person can influence whether or not an attempt at a first strike is made.
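(A rough sketch of the calculation I have in mind, with entirely made-up numbers; the correction for the selection bias is precisely the part I do not know how to write.)

```python
# Each "trial" is, hypothetically, the start of one decision-maker's window
# of influence over a first strike; so far none has ended in a launch.

def p_no_launch_next(no_launch_trials: int, total_trials: int) -> float:
    """Rule of Succession with 'no first strike in this window' as success."""
    return (no_launch_trials + 1) / (total_trials + 2)

n = 2_000  # hypothetical number of such career windows opened since 1945
print(1 - p_no_launch_next(n, n))  # ~0.0005 chance of a launch per new
# window -- before any (unknown) correction for survivorship in the record.
```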
This might be a good place for me to point out that I never bought into the common wisdom, which I have never seen anyone object to or distance themselves from in print, that the chances of a nuclear exchange between the US and Russia went down considerably after the collapse of the Soviet Union in 1991.
What’s your line of thought?
Nuclear war isn’t the same situation, though. We can survive nuclear war at all sorts of levels of intensity, so the selection filter is not nearly the same as “the Sun going out”, which is ~100% fatal. Bostrom’s shadow paper might actually work for nuclear war, from the perspective of a revived civilization, but I’d have to reread it to see.
The selection filter does not have to be total or near total for my point to stand, namely that Rule-of-Succession-like calculations can be useful even when one has enough information to think about the selection effects involved (provided that Rule-of-Succession-like calculations are ever useful).
And, parenthetically, selection effects on observations about whether nuclear exchanges happened in the past can be very strong. Consider, for example, a family that has lived in Washington, D.C., for the last five decades: Washington is such an important target that it is unlikely the family would have survived the launch of most or all of the Soviet/Russian arsenal at the U.S. So, although I agree with you that the human race as a whole would probably have survived almost any plausible nuclear exchange, that does not do the family in D.C. much good. More precisely, it does not do much good for the family’s ability to use historical data on whether or not nukes were launched at the U.S. in the past to refine their probability of launches in the future.
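(A toy Bayesian illustration of the point, with made-up numbers: if a launch would certainly have killed the family, their 50 launch-free years are equally likely under any launch rate, once we condition on their being alive to look back.)

```python
# How survivorship changes what 50 launch-free years are worth as evidence
# about a hypothetical annual launch probability p. All numbers are invented.

def likelihood_given_alive(p_launch: float, years: int,
                           p_survive_launch: float) -> float:
    """P(observer saw no launch | observer is alive)."""
    p_no_launch = (1 - p_launch) ** years
    p_alive = p_no_launch + (1 - p_no_launch) * p_survive_launch
    return p_no_launch / p_alive

for p in (0.001, 0.05):
    dc_family = likelihood_given_alive(p, 50, 0.0)  # a launch is fatal to them
    humanity = likelihood_given_alive(p, 50, 0.9)   # the species likely survives
    print(f"p={p}: D.C. family {dc_family:.2f}, humanity {humanity:.2f}")

# For the family the likelihood is 1.00 under both hypotheses, so their
# observation cannot shift their estimate of future launches at all; for a
# near-certain survivor, the same observation strongly favors the low-p world.
```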
An interesting bracket style. How am I supposed to know where the parenthetical ends?