I have my own actual best guesses for what happens in reasonably good futures, which I can get into. (I’ll flag for now that I think “preserve Earth itself for as long as possible” is a reasonable Schelling point that is compatible with many “go otherwise quite fast” plans.)
I doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon.
Why do you doubt this? (To be clear, it depends on the exact details. But my original query was about a 2-year delay, and Proxima Centauri is about 4 light-years away. What is your story for how taking only 0.1% of the Sun’s energy while we spin up doesn’t slow us down by at least 2 years?)
I have more to say, but maybe I should wait on your answer to that.
Mostly, I think your last comment still had its own missing mood of horror, and/or seemed to assume away any tradeoffs.
(I am with you on “many rationalists seem gung ho about this in a way I find scary”)
The argument is (I assume):
Once centuries have passed, you’ve already sent out huge numbers of space probes that roughly saturate reachable resources. (Because you can probably convert Proxima Centauri fully into probes within <20 years.)
It doesn’t take that much energy to pretty much fully saturate on probes. In particular, Eternity in Six Hours claims that getting the energy for most of the probes you want is possible with just 6 hours of solar output (let alone eating 0.1% of the Sun). Even if we assume this is off by 2 OOMs (e.g., to be confident you get everywhere you need), that still means we can saturate on energy after about a month of solar output. If we’re willing to eat 0.1% of the Sun (presumably at least millions of years of solar output?), the situation isn’t even close. In fact, the key bottleneck according to Eternity in Six Hours is disassembling Mercury (I think due to heat dissipation), though it is hard to be confident in advance.
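As a rough sanity check on these orders of magnitude, here is a back-of-the-envelope sketch in Python. The solar constants are standard values; reading “0.1% of the Sun” as 0.1% of its mass burned as fusion fuel at ~0.7% mass-to-energy efficiency is my assumption, not a claim made in the comment above.

```python
# Back-of-the-envelope check of the energy claims above (assumptions flagged inline).
L_SUN = 3.8e26          # solar luminosity, W
M_SUN = 2.0e30          # solar mass, kg
C = 3.0e8               # speed of light, m/s
YEAR = 3.15e7           # seconds per year

six_hours = 6 * 3600 * L_SUN                 # ~8e30 J, the "Eternity in Six Hours" probe budget
padded = 100 * six_hours                     # padding by 2 OOMs -> 600 hours of solar output
print(padded / L_SUN / 86400, "days of solar output")   # ~25 days, i.e. about a month

# Assumption: "eating 0.1% of the Sun" means burning 0.1% of its mass as fusion fuel
# at ~0.7% mass-to-energy efficiency (not full mass-energy conversion).
fusion_energy = 0.001 * M_SUN * 0.007 * C**2             # ~1e42 J
print(fusion_energy / (L_SUN * YEAR) / 1e6, "million years of solar output")  # ~100
```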
Yes, I wanted to argue something like this.
I agree that I don’t viscerally feel the loss of the 200 galaxies, and maybe that’s a deficiency. But I still find this position crazy. I feel this is a decent parallel dialogue:
Other person: “Here is something I thought of that would increase health outcomes in the world by 0.00000004%.”
Me: “But surely you realize that this measure is horrendously unpopular, and the only way to implement it is through a dictatorial world government.”
Other person: “Well yes, I agree it’s a hard dilemma, but in absolute terms, 0.00000004% of the world population is 3 people, so my intervention would save 3 lives. Think about what a horrible tragedy every death is; I really feel there is a missing mood here when you don’t even consider the upside of my proposal.”
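(A minimal sketch of the arithmetic the analogy relies on, assuming a world population of roughly 8 billion; the implied galaxy count is just what the dialogue’s own numbers entail, not an independent estimate.)

```python
# Numbers implied by the dialogue above (world population of ~8 billion is an assumption).
fraction = 0.00000004 / 100          # 0.00000004% expressed as a fraction, i.e. 4e-10
people_saved = 8e9 * fraction        # ~3 people
implied_galaxies = 200 / fraction    # ~5e11 galaxies, if 200 galaxies is the same fraction of the total
print(people_saved, implied_galaxies)
```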
I do feel sad when the democratic governments of the world screw up policy in a big way according to my beliefs and values (as they often do). I still believe that democracy is the least bad form of government, but yes, I feel the pull of how much better it would be if we were instead governed by a perfect philosopher king who happened to be aligned with my values. But 0.00000004% is just a drop in the ocean.
Similarly, I believe that we should try to maintain a relatively democratic form of government at least in the early AI age (and then the people can slowly figure out whether they find something better than democracy). And yes, I expect that democracies will do a lot of incredibly wasteful, stupid, and sometimes evil things by my lights, and I will sometimes wish that somehow I could have become the philosopher king. That’s just how things always are. But leaving Earth alone really is chump change, and won’t be among the top thousand things I disagree with democracies on.
(Also, again, I think there will be a lot of value in conservatism and caution, especially since we probably won’t be able to fully trust our AIs on the most complicated issues. And I also think there is something intrinsically wrong about destroying Earth: if you cut down the world’s oldest tree to increase health outcomes by 0.00000004%, you are doing something wrong, and people have a good reason to distrust you.)
Most decisions are not made democratically, and pointing out that a majoritarian vote would go against a decision is no argument that it will not happen, nor that it should not happen. This is true of the vast majority of resource-allocation decisions, such as how to divvy up physical materials.
I think it is an argument that it should not happen, but a very weak one, especially for things like this that the current public really hasn’t thought much about.
Also, I think David is just wildly wrong here about what would realistically happen over 10,000 years in a society that could actually start using all of the Sun’s energy. This would involve hundreds of generations of people each deciding not to grow, not to expand into the rest of the solar system, and to pass up enormous opportunities for joy and greatness and creation, out of a specific kind of sentimental attachment to the specific arrangement of our planet and solar system at this moment in time. This attachment is very much real, and worth something, but IMO it obviously will not remotely outweigh the preferences and actions of the trillions of people who will all want to do more things and have more things (and who, since we are talking about gradual expansion, will very much be present in the conversation).
I concede that I was mistaken in saying it was no argument; I will agree with the position that it is a very weak one and is often outweighed by other arguments.
Majority vote is useful specifically for determining who has power, because of the extremely high level of adversarial dynamics there, but in contexts that are not as wildly adversarial (including most specific decisions that an institution makes), other decision-making algorithms are generally better.
You mean that people on Earth and in the solar-system colonies will have enough biological children, and that space travel to other stars will be hard enough for biological people, that they will want the resources from dismantling the Sun? I suppose that’s possible, though I expect they will put some kind of population control for biological people in place before that happens. I agree that also feels aversive, but at some point it needs to be done anyway; otherwise, exponential population growth just brings us back to the Malthusian limit within a few tens of thousands of years, even if we use up the whole Universe. (See Tim Underwood’s excellent rationalist sci-fi novel on the topic.)
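A minimal sketch of why even the whole Universe buys only limited time against exponential growth; the 1% annual growth rate, the ~10^80-atom resource bound, and the atoms-per-person figure are illustrative assumptions, not numbers from the comment above.

```python
import math

# Illustrative assumptions: ~1e10 people today, ~1% annual population growth,
# ~1e80 atoms in the observable universe, ~1e28 atoms needed per person.
current_population = 1e10
growth_rate = 0.01
max_population = 1e80 / 1e28   # ~1e52 people supportable under these assumptions

years_to_limit = math.log(max_population / current_population) / growth_rate
print(years_to_limit)          # ~9,700 years, i.e. on the order of ten thousand years
```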
If you are talking about ems and digital beings, not biological humans, I don’t think they will or should have decision rights over what happens with the solar system, as they can simply move to other stars.
Someone will live on old Earth in your scenario. Unless those people are selected for extreme levels of attachment to specific celestial bodies, as opposed to the function and benefit of those celestial bodies, I don’t see why they would decide not to replace the Sun with a better sun, and also get orders of magnitude richer by doing so.
It seems to me that the majority of those inhabitants of old Earth would simply be people who don’t want to be uploaded (which I expect is a much more common preference than keeping the literal Sun in the sky) and so have a much more limited ability to travel to other solar systems. I don’t see why I would want to condemn most people who don’t want to be uploaded to relative cosmic poverty just because a very small minority of people want to keep burning away most of the usable energy in the solar system for historical reasons.
Are you arguing that, if technologically possible, the Sun should be dismantled in the first few decades after the Singularity, as is implied in the Great Transhumanist Future song (the main thing I’m complaining about here)? In that case, I don’t know of any remotely just and reasonable (democratic, market-based, or other) governance structure that would allow that to happen, given how the majority of people feel.
If you are talking about population dynamics, ownership, and voting shifting over millennia to the point that people decide to dismantle the Sun, then sure, that’s possible, though it’s not what I expect to happen; see my other comment on market trades and my reply to Habryka on population dynamics.
(It is not implied in the song, to be clear; you seem to have a reading of the lyrics I do not understand.
The song talks about there being a singularity in ~20 years, and separately that the Sun is wasteful, but I don’t see any reference to the Sun being dismantled in 20 years. For reference, the lyrics are here: https://luminousalicorn.tumblr.com/post/175855775830/a-filk-of-big-rock-candy-mountain-one-evening-as)
I think that the coder looking up and saying that the Sun burning is distasteful, but that the Great Transhumanist Future will come in 20 years, along with a later mention of “the Sun is a battery”, together implies that the Sun is getting dismantled in the near future. I guess you can debate how strong the implication is; maybe they just want to dismantle the Sun in the long term and are currently only using it as a battery in some benign way, but I think that’s not the most natural interpretation.