Well, yeah, if we knew what you call S (that AGI would occur in 2011 or would never occur), then our surviving 2011 would mean that AGI will never occur.
But your example fails to shed light on the argument in the great-grandparent.
If I may suggest a different example, one that I believe is analogous to the argument in the great-grandparent:
Suppose I give you a box that displays either “heads” or “tails” when you press a button on the box.
The reason I want you to consider a box rather than a coin is that a person can make a pretty good estimate of the “fairness” of a coin just by looking at it and holding it in their hand.
Do not make any assumptions about the “fairness” of the box. Do not, for example, assume that if you pushed the button a million times, the box would display “heads” about 500,000 times.
What is your probability that the box will display “heads” when you push the button?
.5, obviously, because even if the box is extremely “unfair” or biased, you have no way of knowing whether it is biased towards “heads” or towards “tails”.
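(For concreteness, a minimal sketch of why this is .5, assuming only that your prior over the box’s heads-bias is symmetric about one half; the uniform prior below is just one such choice:)

```python
import random

def first_press_heads_prob(n_samples=100_000):
    # Draw a heads-bias from a uniform prior over [0, 1] (any prior symmetric
    # about 0.5 gives the same answer), then average P(heads | bias) = bias.
    total = 0.0
    for _ in range(n_samples):
        bias = random.random()
        total += bias
    return total / n_samples

print(first_press_heads_prob())  # comes out to roughly 0.5
```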
Suppose further that you cannot survive the box coming up “tails”.
Now suppose you push the button ten times and of course it comes up “heads” all ten times.
Updating on the results of your first ten button-presses, what is your probability that it will come up “heads” if you push the button an eleventh time?
Do you, for example, say, “Well, clearly this box is very biased towards heads”?
Do you use Laplace’s law of succession to compute the probability?
This is more or less what I was trying to do, but I neglected to treat “AGI is impossible” as equivalent to “AGI will never happen”.
I need to have a prior in order to update, so sure, let’s use Laplace.
I’d have to be an idiot to ever press the button at all, but let’s say I’m in Harry’s situation with the Time-Turner and someone else pushed the button ten times before I could tell them not to.
I don’t feel like doing the calculus to actually apply Bayes myself here, so I’ll lean on my vague non-understanding of Wikipedia’s formula for the rule of succession and say p = 11/12.
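(No calculus actually needed; the rule of succession already reduces the Beta integral to (s + 1)/(n + 2). A quick sketch of that arithmetic, assuming the uniform prior over the box’s bias that Laplace’s rule presupposes:)

```python
def rule_of_succession(successes, trials):
    # Laplace's rule: with a uniform prior over the bias,
    # P(next trial succeeds | s successes in n trials) = (s + 1) / (n + 2).
    return (successes + 1) / (trials + 2)

print(rule_of_succession(10, 10))  # 0.9166... = 11/12
```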