1) The math may work out for this, but you’re giving up a lot of potential-existence-time to do so (halfway or more to the heat-death of the universe).
2) We haven’t gotten off this planet, let alone to another star, so it seems a bit premature to plan to get out of many-eon light cones.
3) If there is an event that shows offense stronger than defense (and you’re a defender), it’s too late to get away.
4) Wherever you go, you’re bringing the seeds of such an event with you—there’s nothing that will make you or your colony immune from whatever went wrong for the rest of the known intelligent life in the universe.
(1) Agreed, although I would get vastly more resources to personally consume! Free energy is probably the binding constraint on computation, which in turn is probably the post-singularity binding limit on meaningful lifespan (see the rough sketch after this list).
(2) An intelligence explosion might collapse to minutes the time between when humans could walk on Mars and when my idea becomes practical to implement.
(3) Today offense is stronger than defense, yet I assign a high probability to my personally surviving another year.
(4) Perhaps. But what might go wrong is a struggle for limited resources among people with sharply conflicting values. If, today, a small group of people carefully chosen by some leader such as Scott Alexander could move to an alternate Earth in another Hubble volume, and he picked me to be in the group, I would greatly raise my estimate of the probability that the civilization I’m part of survives a million years.
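To make reply (1) a bit more concrete, here is a back-of-the-envelope sketch, assuming Landauer's principle (roughly k·T·ln 2 joules per irreversible bit operation) as the floor on the energy cost of computation; the operating temperature and the total free-energy budget below are illustrative assumptions of mine, not figures from this exchange.

```python
import math

# Landauer bound: minimum energy per irreversible bit operation is k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 2.7              # assumed operating temperature (~present CMB temperature), K
landauer_joules_per_bit = k_B * T * math.log(2)

# Hypothetical free-energy budget available to a colony (assumption for illustration).
energy_budget_joules = 1e40

# Upper bound on total irreversible bit operations that budget can ever pay for.
max_bit_operations = energy_budget_joules / landauer_joules_per_bit

print(f"Landauer cost per bit at {T} K: {landauer_joules_per_bit:.3e} J")
print(f"Upper bound on irreversible bit operations: {max_bit_operations:.3e}")
```

On these assumptions the free-energy budget, not clock time, caps how much computation (and hence how much subjectively meaningful lifespan) is available, which is the sense in which I call it the binding limit.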