Irrationality game:
Nice idea. This way I can safely test whether my baseline opinion on LW topics is as contrarian as I think.
My proposition:
On The Simulation Argument I go for "(1) the human species is very likely to go extinct before reaching a 'posthuman' stage" (80%).
Correspondingly on The Great Filter I go for failure to reach “9. Colonization explosion” (80%).
This is not because I think that humanity is going to self-annihilate soon (though this is a possibility).
What is the extinction scenario you have in mind?
Hopefully no extinction during the next many thousands of years. Which extinction scenario strikes after that is difficult to predict.
As I briefly argued in my baseline post, I think a posthuman state is unlikely due to thermodynamics/complexity constraints.