So, I’m with you on “hey guys, uh, this is pretty horrifying, right? Uh, what’s with the missing mood about that?”.
The issue is that not-eating-the-sun is also horrifying. (See also All Possible Views About Humanity’s Future Are Wild.) To not eat the sun is to throw away orders of magnitude more resources than anyone has ever thrown away before. Is it, percentage-wise, “a small fraction of the cosmos”? Sure. But (quickly checks Claude, which wrote up a Fermi code snippet before answering; I can share the work if you want to double-check yourself), a two-year delay would be… 0.00000004% of the universe lost beyond the lightcone horizon, which doesn’t sound like much except that’s 200 galaxies lost.
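(For concreteness, here’s a minimal sketch of that kind of Fermi, with round-number inputs I’m supplying rather than the exact ones Claude used. The headline fraction is fairly robust; the galaxy count swings between tens and a few hundred depending on which galaxy census and which reachable radius you assume.)

```python
# Minimal Fermi sketch (assumed round-number inputs, not Claude's actual ones):
# how much of the eventually-reachable universe slips beyond the cosmic event
# horizon if expansion starts two years later.
import math

delay_years = 2.0

# Assumptions (order-of-magnitude only):
r_reachable_gly = 16.5       # comoving radius of the affectable universe, in Gly
r_observable_gly = 46.5      # comoving radius of the observable universe, in Gly
galaxies_observable = 2e12   # rough galaxy count in the observable universe

# Each year of delay, the comoving radius of the reachable region shrinks by
# roughly one light-year (the boundary effectively recedes at ~c).
shrink_gly = delay_years * 1e-9

# Thin-shell approximation: fraction of reachable volume lost ~ 3 * dR / R.
fraction_lost = 3 * shrink_gly / r_reachable_gly

# Galaxies in the lost shell, using the observable universe's mean density.
galaxy_density = galaxies_observable / ((4 / 3) * math.pi * r_observable_gly**3)
galaxies_lost = galaxy_density * 4 * math.pi * r_reachable_gly**2 * shrink_gly

print(f"fraction of reachable universe lost: {fraction_lost:.1e} (~{fraction_lost:.10%})")
print(f"galaxies lost (very rough, tens to hundreds): ~{galaxies_lost:.0f}")
```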
When you compare that to “the Amish get a Sun Replica that doesn’t change their experience”, the question is “Is it worth throwing away 80 trillion stars for the Amish to have the real thing?” It does not seem obviously worth it.
IMO there isn’t an option that isn’t at least a bit horrifying in some sense that one could have a missing mood about. And while I still feel unsettled about it, I think if I have to grieve something, it makes more sense to grieve in the direction of “don’t throw away 80 trillion stars’ worth of resources.”
I think you’re also maybe just not appreciating how much would change in 10,000 years? Like, there is no single culture that has survived 10,000 years. (Maybe one of those small tribes in the Amazon? I’d still bet, though not confidently, on there having been a lot of cultural drift there.) The Amish are only a few hundred years old. I can imagine doing a lot of moral reflection and coming to the conclusion that the sun shouldn’t be eaten until all human cultures have decided it’s the right thing to do, but I really doubt that process takes 10,000 years.
Why is this horrifying? Are we doing anything with those galaxies right now? What is this talk of “throwing away”, “lost”, etc.?
You speak as if we could be exploiting those galaxies at the extreme edge of the observable universe, like… tomorrow, or next week… if only we don’t carelessly lose them. Like we have these “resources” sitting around, at our disposal, as we speak. But of course nothing remotely like this is true. How long would it even take to reach any of these places? Billions of years, right? So the question is:
“Should we do something that might possibly somehow affect something that ‘we’, in some broad sense (because who even knows whether humanity will be around at the time, or in what form), will be doing several billion years from now, in order to avoid dismantling the Sun?”
Pretty obvious the answer is “duh, of course, this is a no-brainer, yes we should, are you even serious—billions of years, really?—clearly we should”.
You’re the one who’s talking about stuff billions of years from now, so this argument applies literally, like, a million times more to your position than to the one you’re arguing against!
In any case, “let’s not dismantle the Sun until and unless we all agree that it’s a good idea” seems reasonable. If the Amish (and people like me) come around to your view in 10 years, great, that’s when we’ll crank up the star-lifters. If we’re still opposed a million years from now, well, too bad—find another star to dismantle. (In fact, here’s an entire galaxy that probably won’t be missed.)
When the personal life expectancy of these same people alive today is something like 1e34 years, billions of years is very little.
I don’t think that this is true.
I think this is a false dilemma. If all human cultures on Earth come to the conclusion in 1000 years that they would like the Sun to be dismantled (which I very much doubt), then sure, we can do that. But by that point, we could already have built awesome industrial bases by dismantling Alpha Centauri, or by harvesting the 0.1% of the Sun whose removal doesn’t affect anything on Earth. I doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon.
The thing that actually has costs is not immediately bulldozing down Earth and turning it into a maximally efficient industrial powerhouse at the cost of killing every biological body, or the ASI passing up an opportunity to dismantle the Sun on short notice (the post alludes to 10,000 years being a very conservative estimate and to “the assumption that ASI eats the Sun within a few years”). But that’s not going to happen democratically. There is no way you get 51% of people [1] to vote for bulldozing down Earth and killing their biological bodies, and I very much doubt you get that vote even for dismantling the Sun in a few years and putting some fake sky around Earth for protection. It’s possible there could be a truly wise philosopher king who would, with an aching heart, overrule everyone else’s objections and bulldoze down Earth to get those extra 200 galaxies at the edge of the Universe, but then govern the Universe wisely and benevolently in a way that people on reflection all approve of. But in practice, we are not going to get a wise philosopher king. I expect that any government that decides to destroy the Sun for the greater good, against the outrage of the vast majority of people, will also be a bad ruler of the Universe.
I also believe that AI alignment is not a binary, and even in the worlds where there is no AI takeover, we will probably get an AI we initially can’t fully trust to follow the spirit of our commands in exotic situations we can’t really understand. In that case, it would be extremely unwise to immediately instruct it to create mind uploads (how faithful would those be?) and bulldoze down the world to turn the Sun into computronium. There are a lot of reasons for taking things slow.
Usually rationalists are pretty reasonable about these things, and endorse democratic government and human rights, and they even often like talking about the Long Reflection and taking things slow. But then they start talking about dismantling the Sun! This post can kind of defend itself by saying it was proposing a less immediate and horrifying implementation (though there really is a missing mood here), but there are other examples, most notably the Great Transhumanist Future song in last year’s Solstice, where a coder looks up at the burning Sun disapprovingly, and in twenty years with a big ol’ computer they will use the Sun as a battery.
I don’t know if the people talking like that are so out of touch that they believe that with a little convincing everyone will agree to dismantle the Sun in twenty years, or they would approve of an AI-enabled dictatorship bulldozing over Earth, or they just don’t think through the implications. I think it’s mostly that they just don’t think about it too hard, but I did hear people coming out in favor of actually bulldozing down Earth (usually including a step where we forcibly increase everyone’s intelligence until they agree with the leadership), and I think that’s very foolish and bad.
[1] And even 51% of the vote wouldn’t be enough in any good democracy to bulldoze over everyone else.
I have my own actual best guesses for what happens in reasonably good futures, which I can get into. (I’ll flag for now that I think “preserve Earth itself for as long as possible” is a reasonable Schelling point that is compatible with many “go otherwise quite fast” plans.)
Why do you doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon? (To be clear, it depends on exact details. But my original query was about a 2-year delay. Proxima Centauri is 4 light-years away. What is your story for how only taking 0.1% of the sun’s energy while we spin up doesn’t slow us down by at least 2 years?)
I have more to say but maybe should wait on your answer to that.
Mostly, I think your last comment still had its own missing mood of horror, and/or seemed to be assuming away any tradeoffs.
(I am with you on “many rationalists seem gung ho about this in a way I find scary”)
The argument is (I assume):
Once centuries have passed, you’ve already sent out huge numbers of space probes that roughly saturate the reachable resources. (Because you can convert Proxima Centauri fully into probes within <20 years, probably.)
It doesn’t take that much energy to pretty much fully saturate on probes. In particular, Eternity in Six Hours claims getting the energy for most of the probes you want is possible with just 6 hours of solar output (let alone eating 0.1% of the sun). Even if we assume this is off by 2 OOMs (e.g. to be confident you get everywhere you need), that still means we can saturate on energy after 1 month of solar output. If we’re willing to eat 0.1% of the sun (presumably at least millions of years of solar output?), the situation isn’t even close. In fact, the key bottleneck based on Eternity in Six Hours is disassembling Mercury (I think due to heat dissipation), though it is hard to be confident in advance.
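(A quick sanity check of those orders of magnitude, with round numbers I’m supplying rather than figures taken from the paper:)

```python
# Rough sanity check (my round numbers, not figures from the paper): compare
# "6 hours of solar output", that plus 2 OOMs of margin, and the energy you
# could get from fusing 0.1% of the Sun's mass.
SOLAR_LUMINOSITY_W = 3.8e26   # watts
SUN_MASS_KG = 2e30
C = 3e8                       # m/s
FUSION_EFFICIENCY = 0.007     # ~0.7% of rest mass released by hydrogen fusion

six_hours_j = SOLAR_LUMINOSITY_W * 6 * 3600
with_2_ooms_j = six_hours_j * 100
point1_percent_sun_j = 0.001 * SUN_MASS_KG * C**2 * FUSION_EFFICIENCY

seconds_per_month = 30 * 24 * 3600
seconds_per_year = 3.15e7

print(f"6 hours of solar output:        {six_hours_j:.1e} J")
print(f"with 2 OOMs of margin:          {with_2_ooms_j / SOLAR_LUMINOSITY_W / seconds_per_month:.1f} months of solar output")
print(f"fusing 0.1% of the Sun's mass:  {point1_percent_sun_j / SOLAR_LUMINOSITY_W / seconds_per_year:.0e} years of solar output")
```

(With these inputs the 2-OOM-margin case comes out to roughly a month of solar output, and 0.1% of the Sun’s mass corresponds to something like a hundred million years of solar output, so “at least millions of years” looks safe.)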
Yes, I wanted to argue something like this.
I agree that I don’t viscerally feel the loss of the 200 galaxies, and maybe that’s a deficiency. But I still find this position crazy. I feel this is a decent parallel dialogue:
Other person: “Here is something I thought of that would increase health outcomes in the world by 0.00000004%.”
Me: “But surely you realize that this measure is horrendously unpopular, and the only way to implement it is through a dictatorial world government.”
Other person: “Well yes, I agree it’s a hard dilemma, but in absolute terms, 0.00000004% of the world population is 3 people, so my intervention would save 3 lives. Think about what a horrible tragedy every death is; I really feel there is a missing mood here when you don’t even consider the upside of my proposal.”
I do feel sad when the democratic governments of the world screw up policy in a big way according to my beliefs and values (as they often do). I still believe that democracy is the least bad form of government, but yes, I feel the temptation of how much better it would be if we were instead governed by a perfect philosopher king who happens to be aligned with my values. But 0.00000004% is just a drop in the ocean.
Similarly, I believe that we should try to maintain a relatively democratic form of government at least in the early AI age (and then people can slowly figure out if they find something better than democracy). And yes, I expect that democracies will do a lot of incredibly wasteful, stupid, and sometimes evil things by my lights, and I will sometimes wish that somehow I could have become the philosopher king. That’s just how things always are. But leaving Earth alone really is chump change, and won’t be among the top thousand things I disagree with democracies on.
(Also, again, I think there will also be a lot of value in conservatism and caution, especially since we probably won’t be able to fully trust our AIs on the most complicated issues. And I also think there is something intrinsically wrong about destroying Earth: I think that if you cut down the world’s oldest tree to increase health outcomes by 0.00000004%, you are doing something wrong, and people have a good reason to distrust you.)
Most decisions are not made democratically, and pointing out that a majoritarian vote is against a decision is no argument that it will not happen, nor that it should not happen. This is true of the vast majority of resource allocation decisions, such as how to divvy up physical materials.
I think it is an argument that they should not happen, but a very weak one, especially for things like this, which the current public really hasn’t thought much about.
Also, I think David is just wildly wrong here about what would realistically happen over 10,000 years in a society that could actually start using all of the sun’s energy. This would involve hundreds of generations of people each deciding not to grow, not to expand into the rest of the solar system, to pass up enormous opportunities for joy and greatness and creation, out of a sentimental attachment to the specific arrangement of our planet and solar system at this moment in time. This attachment is very much real, and worth something, but IMO it obviously will not remotely outweigh the preferences and actions of the trillions of people who will all want to do more things and have more things (and who, since we are talking about gradual expansion, are very much present in the conversation).
I concede that I was mistaken in saying it was no argument; I will agree with the position that it is a very weak one and is often outweighed by other arguments.
Majority vote is useful specifically in determining who has power because of the extremely high level of adversarial dynamics, but in contexts that are not as wildly adversarial (including most specific decisions that an institution makes) generally other decision-making algorithms are better.
You mean that people on Earth and the solar system colonies will have enough biological children, and space travel to other stars will be hard enough for biological people, that they will want the resources from dismantling the Sun? I suppose that’s possible, though I expect they will put some kind of population control for biological people in place before that happens. I agree that also feels aversive, but at some point it needs to be done anyway; otherwise exponential population growth just brings us back to the Malthusian limit a few tens of thousands of years from now, even if we use up the whole Universe. (See Tim Underwood’s excellent rationalist sci-fi novel on the topic.)
If you are talking about ems and digital beings, not biological humans, I don’t think they will or should have decision rights over what happens with the solar system, as they can simply move to other stars.
Someone will live on old earth in your scenario. Unless those people are selected for extreme levels of attachment to specific celestial bodies, as opposed to the function and benefit of those celestial bodies, I don’t see why those people would decide to not replace the sun with a better sun, and also get orders of magnitude richer by doing so.
It seems to me that the majority of those inhabitants of old earth would simply be people who don’t want to be uploaded (which I expect is a much more common preference than maintaining the literal sun in the sky) and so have much more limited ability to travel to other solar systems. I don’t see why I would want to condemn most people who don’t want to be uploaded to relative cosmic poverty just because a very small minority of people want to keep burning away most of the usable energy in the solar system for historical reasons.
Because hopefully those people will include, and (depending on population control) might indeed be overwhelmingly composed of, the current, pre-singularity population of Earth. I don’t think a majority of currently-alive humans would ever agree to destroy the Sun, and that includes being unwilling to self-modify into minds that would agree to destroy the Sun.
Raemon spoke upthread about how no single culture has survived 10,000 years, but that was in a world with mortality.
Are you arguing that, if technologically possible, the Sun should be dismantled in the first few decades after the Singularity, as is implied in the Great Transhumanist Future song, the main thing I’m complaining about here? In that case, I don’t know of any remotely just and reasonable (democratic, market-based, or other) governance structure that would allow that to happen given how the majority of people feel.
If you are talking about population dynamics, ownership and voting shifting over millennia to the point that they decide to dismantle the Sun, then sure, that’s possible, though that’s not what I expect to happen, see my other comment on market trades and my reply to Habryka on population dynamics.
(It is not implied in the song, to be clear; you seem to have a reading of the lyrics I do not understand.
The song talks about there being a singularity in ~20 years, and separately that the sun is wasteful, but I don’t see any reference to the sun being dismantled in 20 years. For reference, lyrics are here: https://luminousalicorn.tumblr.com/post/175855775830/a-filk-of-big-rock-candy-mountain-one-evening-as)
I think that the coder looking up and saying that the Sun burning is distasteful but the Great Transhumanist Future will come in 20 years, along with a later mention of “the Sun is a battery”, together implies that the Sun is getting dismantled in the near future. I guess you can debate how strong the implication is; maybe they just want to dismantle the Sun in the long term and are currently only using it as a battery in some benign way, but I think that’s not the most natural interpretation.
I think the 20 years somewhat unambiguously refers to timelines until AGI is built.
Separately, I think “the sun is a battery” also doesn’t really imply anything about the sun getting dismantled; if anything it seems to me to imply that the sun is still intact (and probably surrounded by a Dyson swarm or sphere).
It sounds like there are actually like 3-5 different object-level places where we’re talking about slightly different things. I also updated on the practical aspect from Ryan’s comment. So, idk, here’s a bunch of distinct points.
1.
Ryan Greenblatt’s comment updated me that the energy requirements here are minimal enough that “eating the sun” isn’t really going to come up as a consideration for astronomical waste. (Eating the Earth or most of the solar system seems like it still might be. But, I agree we shouldn’t Eat the Earth)
2.
I’d interpreted most past comments about near-term (i.e. measured in decades) crazy shit to be about building Dyson spheres, not Star Lifting. (i.e. I expected the “20 years from now in some big ol’ computer” in the solstice song to be about Dyson spheres and voluntary uploads.) I think many people will still freak out about Dyson-sphering the sun (not sure if you would). I would personally argue “it’s just pretty damn important to Dyson-sphere the sun even if it makes people uncomfortable (while designing it such that Earth still gets enough light).”
3.
I agree that in 1000 years it won’t much matter whether you Starlift, for astronomical waste reasons. But I do expect that in 1000 years, even assuming a maximally consent-oriented / conservative-with-regards-to-bio-human-values and all-around “good” outcome, most people will have shifted to running on computronium and experienced much more than 1000 years of subjective time, and their intuitions about what’s good will just be real different. There may be small groups of people who continue living in bio-world, but most of them will still probably be pretty alien by our lights.
I think I do personally hope they preserve the Earth as a sanctuary and/or historical relic. But I think there are a lot of possible compromises, like “starlift a lot of material out of the sun, but move the Earth closer to the sun to compensate” (I haven’t looked into the physics here; the details are obviously cruxy).
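(For a rough sense of that physics, under the textbook assumption that main-sequence luminosity scales like roughly M^3.5 and that Earth’s sunlight goes as L/d², ignoring how the Sun’s spectrum and structure would actually respond:)

```python
# Back-of-envelope only: if you starlift away a fraction f of the Sun's mass,
# luminosity drops roughly like (1 - f)**3.5 (main-sequence mass-luminosity
# scaling), so keeping Earth's received flux constant means moving it inward
# to d' = 1 AU * (1 - f)**(3.5 / 2). Ignores spectral shifts and the slow
# thermal re-adjustment of the star.
def new_earth_orbit_au(f_removed: float, ml_exponent: float = 3.5) -> float:
    return (1.0 - f_removed) ** (ml_exponent / 2)

for f in (0.001, 0.01, 0.1, 0.5):
    print(f"remove {f:.1%} of the Sun's mass -> Earth at ~{new_earth_orbit_au(f):.2f} AU")
```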
When I imagine any kind of actual realistic future that isn’t maximally conservative (i.e. the bio humans are < .1% of the solar system’s population and just don’t have that much bargaining power), it seems even more likely that they’ll at least work on compromise solutions that preserve a reasonable Earth experience but eat a bunch of the sun, if there turn out to be serious tradeoffs there. (Again I don’t actually know enough physics here and I’m recently humbled by remembering the Eternity in Six Hours paper, maybe there’s literally no tradeoffs here, but, I’d still doubt it)
4.
It sounds like it’s not particularly cruxy anymore, but I think the “0.00000004% of the Earth’s current population” analogy is just quite different. 80 trillion suns involves more value than has ever existed before; 3 lives is (relatively) insignificant compared to many political compromises we’ve made, even going back thousands of years. Maybe whatever descendants get to reap that value are so alien that they just don’t count as valuable by today’s lights, and it’s reasonable to have some extreme time discounting here, but if any values-you-care-about survived, it would be huge.
I agree both morally and practically with “it’s way more important to make sure we have good global coordination systems that don’t predictably either descend into a horrible totalitarianism, or trigger a race for power that causes horrible wars or other bad things, than to get those 80 trillion suns.” But, like, the 80 trillion suns are still a big deal.
5.
I’ll note it’s also not a boolean whether we “bulldoze the earth” or “bulldoze the rest of the solar system” in rushing to build a Dyson sphere. You can start the process with a bunch of mining in some remote mountain regions or whatever without eating the whole earth. (But I think it might be bad to do this, because “don’t harvest Earth” is just a nice simple Schelling rule, and once you start haggling over the details I do get a lot more worried.)
6.
I recall reading it’s actually maybe cheaper to use asteroids than Mercury to make a Dyson sphere, because you don’t need to expensively lift things out of the gravity well. It is appealing to me to avoid deconstructing any of the charismatic astronomical objects until we’ve had more time to think/orient/grow-as-a-people, if there turn out to be no tradeoffs involved in that.
7.
Part of my outlook here is that I spent the last 14 years being personally uninterested in and scared by the sorts of rapid/crazy/exponential change you’re wary of. In the past few years, I’ve adjusted to be more personally into it. I don’t think I would have wanted to rush that grieving/orienting process for Past Me, even though it cost me a lot of important time and resources. (I’m referring here more to stuff like The God of Humanity, and the God of the Robot Utilitarians.)
But I do wish I had somehow sped along the non-soulfully-traumatic parts of the process (i.e. some of the updates were more simple/straightforward and if someone had said the right words to me, I think I’d have gotten a strictly better outcome by my original lights).
I expect most humans, given the opportunity to experiment on their own terms, will gradually have some kind of perspective shift here (maybe on a longer timescale than Me Among the Rationalists, but, like, <500 years). I don’t want people to feel rushed about it, but I think some societal structures will lend themselves to dallying more and accumulating serious tradeoffs, and some less.
I feel reassured that you don’t want to Eat the Earth while there are still biological humans who want to live on it.
I still maintain that under governance systems I would like, I would expect the outcome to be very conservative with the solar system in the next thousand years. Like, one default governance structure I quite like is to parcel out the Universe equally among the people alive during the Singularity, have a binding constitution on what they can do in their fiefdoms (no torture, etc.), and allow them to trade and give away their stuff to their biological and digital descendants. There could also be a basic income coming to all biological people,[1] though not to digital people, as it’s too easy to mass-produce them.
One year of delay in cosmic expansion costs us around 1 in a billion of the reachable Universe, under some assumptions on where the grabby aliens are (if they exist). One year also costs us around 1 in a billion of the Sun’s mass being burned, if like Habryka you care about using the solar system optimally for the sake of the biological humans who want to stay. So one year of delay can be bought by 160 people paying out 10% of their wealth. I really think you won’t see things like moving the Earth closer to the Sun in the next 200 years; there will just always be enough people to pay out. It just takes 10,000 traditionalist families; literally the Amish could easily do it. And it won’t matter much: the cosmic acceleration will soon become a moot point as we build out other industrial bases, and I don’t expect the biological people to feel much of a personal need to dismantle the Sun anytime soon. Maybe in 10,000 years the objectors will run out of money, and the bio people either overpopulate or develop expensive hobbies like building planets for themselves and decide to dismantle the Sun, though I expect them to be rich enough to just haul in matter from other stars if they want to.
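(Spelling out that arithmetic, with the population figure as my own assumption and reading the two one-in-a-billion costs as adding up to roughly 2e-9 of the parceled-out total:)

```python
# Sketch of the payment arithmetic (population figure and the decision to sum
# the two one-in-a-billion costs are my assumptions, not settled numbers).
population = 8e9          # people alive at the Singularity, each owning an equal share
cost_per_year = 2e-9      # fraction of the total parceled-out wealth lost per year of delay
payment_fraction = 0.10   # each objector pays 10% of their own share

shares_needed = cost_per_year * population        # person-shares needed per year of delay
people_needed = shares_needed / payment_fraction  # -> 160 people paying 10% each
print(f"people paying 10% of their wealth to buy one year of delay: {people_needed:.0f}")

# Over 200 years that's ~160 * 200 = 32,000 one-off payments, which ~10,000
# multi-person traditionalist families could plausibly cover.
print(f"one-off 10% payments needed for 200 years of delay: {people_needed * 200:,.0f}")
```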
By the way, I recommend Tim Underwood’s sci-fi, The Accord, as a very good exploration of these topics; I think it’s my favorite sci-fi novel.
As for the 80 trillion stars, I agree it’s a real loss, but for me this type of sadness feels “already priced in”. I already accepted that the world won’t and shouldn’t be entirely my personal absolute kingdom, so other people’s decisions will cause a lot of waste from my perspective, and 0.00000004% is just a really negligible part of this loss. In this, I think my analogy to current governments is quite apt: I already accepted that the world will be wasteful compared to the rule of a dictatorship perfectly aligned with me, but that’s how it needs to be.
[1] Though you need to pay attention to overpopulation. If the average biological couple has 2.2 children, the Universe runs out of atoms to support humans in 50 thousand years. Exponential growth is crazy fast.
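(A quick check of that figure, under assumptions I’m adding: roughly 10^80 atoms in the reachable universe, ~7×10^27 atoms per human body, ~50-year generations, starting from 8 billion people:)

```python
# Quick check of the "50 thousand years" figure (all inputs are my assumptions).
import math

atoms_universe = 1e80          # rough atom count of the reachable universe
atoms_per_person = 7e27        # rough atom count of a human body
start_population = 8e9
growth_per_generation = 2.2 / 2   # 2.2 children per couple -> x1.1 per generation
generation_years = 50

max_people = atoms_universe / atoms_per_person
generations = math.log(max_people / start_population) / math.log(growth_per_generation)
print(f"years until the atoms run out: ~{generations * generation_years:,.0f}")
```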