I just changed my mind in this direction:

Due to updates about simulation shutdown risk and the difficulty of FAI philosophy (I think it's easier than I used to believe, though still very hard), I think an FAI team is a better idea than I thought four months ago.
… and slightly upgraded my expectation of human non-extinction.
Damn, it is easy to let other people do (some of) your thinking for you.