According to the argument here, we occupy a nearly perfectly average position in the series of habitable-planet-years, rather than an early one in the universe's history, as it first appears. If so, it strongly suggests that the human race will go extinct on Earth rather than ever moving anywhere else. I think this is probably what will happen.
And even if it does not, physics suggests even more strongly that the human race will go extinct sooner or later. I am fairly sure this will happen.
The DA just supports things we already know: there is no reason, except wishful thinking, to think that humans will not go extinct in the normal way.
Totally agree. But in my interpretation of ADT, the DA should not stop us from trying to survive (in a comment above, Stuart said that it is not the DA but the “presumptuous philosopher” paradox), as there is still a small chance.
I also use what I call the Meta Doomsday Argument. It basically says that there is logical uncertainty about whether the DA, or any version of it, is true, so we should assign some subjective probability Ps to the DA being true. Let’s say it is 0.5.
Since the DA is itself a probabilistic argument, we should multiply Ps by the DA’s probability shift, and we will still get a large update in the extinction probability as a result.
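A minimal worked version of that multiplication, as I understand it (the prior and DA-shifted values below are illustrative assumptions, not from the thread): treat Ps as the weight on the DA being valid and mix the two hypotheses,

$$P(\text{doom}) = P_s \cdot P_{\text{DA}}(\text{doom}) + (1 - P_s) \cdot P_{\text{prior}}(\text{doom}).$$

With $P_s = 0.5$, an assumed prior $P_{\text{prior}} = 0.1$, and an assumed DA-shifted value $P_{\text{DA}} = 0.9$, this gives $P(\text{doom}) = 0.5 \cdot 0.9 + 0.5 \cdot 0.1 = 0.5$, a fivefold update over the prior; even 50% confidence in the DA still produces a large shift.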
I agree with all this.