This. I’ve been known to say that if I were a billionaire, my third priority would be building a ridiculous castle and living out my days as an eccentric headmaster.
This belies the more down-to-earth intention that if, after I look into it in more detail, FAI and life extension both seem insufficient to prevent biological death in my lifetime (even if not information-theoretic death), then investing in injecting sanity into the world (even if concentrated in a few world-beaters) would be a likely next priority. (Cf. MIRI, CFAR.) So I’m definitely interested in the idea of rational!Academy.