So, I’m with you on “hey guys, uh, this is pretty horrifying, right? Uh, what’s with the missing mood about that?”.
The issue is that not-eating-the-Sun is also horrifying; see also "All Possible Views About Humanity's Future Are Wild." To not eat the Sun is to throw away orders of magnitude more resources than anyone has ever thrown away before. Is it, percentage-wise, "a small fraction of the cosmos"? Sure. But (I quickly checked with Claude, which wrote up a Fermi-estimate code snippet before answering; I can share the work if you want to double-check it yourself) a two-year delay would mean roughly 0.00000004% of the universe lost beyond the lightcone horizon, which doesn't sound like much, except that's about 200 galaxies lost.
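For the skeptical, here is a minimal sketch of the kind of Fermi estimate involved (my own reconstruction, not Claude's actual snippet). It assumes the comoving event horizon shrinks by roughly one light-year per year today, a horizon comoving radius of about 16.5 billion light-years, and the roughly 5×10^11 reachable galaxies implied by the figures above; all three inputs are rough assumptions.

```python
# Fermi estimate: galaxies lost beyond the cosmic event horizon per year of delay.
# Crude model: galaxies at the edge slip out of reach as the comoving horizon
# shrinks by ~1 comoving light-year per year (roughly valid today, since a(t) ~ 1).
R_H_LY = 16.5e9      # comoving radius of the cosmic event horizon, in light-years (rough)
N_GALAXIES = 5e11    # galaxies inside the eventually-reachable volume (assumed)
DELAY_YEARS = 2

# Fraction of reachable comoving volume in the lost outer shell, to first order:
# dV / V = 3 * dR / R for a thin shell on a sphere.
fraction_lost = 3 * DELAY_YEARS / R_H_LY   # ~3.6e-10, i.e. ~0.00000004%
galaxies_lost = fraction_lost * N_GALAXIES  # ~180 galaxies

print(f"fraction of reachable universe lost: {fraction_lost:.1e}")
print(f"galaxies lost: {galaxies_lost:.0f}")
```

With these inputs, a two-year delay comes out to a few times 10^-10 of the reachable universe, which is in the same ballpark as the 0.00000004% and ~200-galaxy figures; the exact count swings with the assumed galaxy total.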
When you compare that to "the Amish get a Sun Replica that doesn't change their experience," the question becomes: "Is it worth throwing away 80 trillion stars for the Amish to have the real thing?" It does not seem obviously worth it.
IMO there isn't an option that isn't at least a bit horrifying in some sense one could have a missing mood about. And while I still feel unsettled about it, I think that if I have to grieve something, it makes more sense to grieve in the direction of "don't throw away 80 trillion stars' worth of resources."
I think you're also maybe just not appreciating how much would change in 10,000 years. There is no single culture that has survived 10,000 years. (Maybe one of those small tribes in the Amazon? I'd still bet, though not confidently, on there having been a lot of cultural drift even there.) The Amish are only a few hundred years old. I can imagine doing a lot of moral reflection and concluding that the Sun shouldn't be eaten until all human cultures have decided it's the right thing to do, but I really doubt that process takes 10,000 years.
I think this is a false dilemma. If all human cultures on Earth conclude in 1,000 years that they would like the Sun to be dismantled (which I very much doubt), then sure, we can do that. But by that point we could already have built awesome industrial bases by dismantling Alpha Centauri, or just by dismantling the 0.1% of the Sun whose loss doesn't affect anything on Earth. I doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon.
The option that actually carries a cost is declining to immediately bulldoze Earth and turn it into a maximally efficient industrial powerhouse, at the price of killing every biological body. Likewise if the ASI has the opportunity to dismantle the Sun on short notice (the post alludes to 10,000 years being a very conservative estimate, and to "the assumption that ASI eats the Sun within a few years"). But that's not going to happen democratically. There is no way you get 51% of people [1] to vote for bulldozing Earth and killing their biological bodies, and I very much doubt you get that vote even for dismantling the Sun in a few years and putting some fake sky around Earth for protection. It's possible there could be a truly wise philosopher king who would, with an aching heart, overrule everyone else's objections and bulldoze Earth to get those extra 200 galaxies at the edge of the Universe, and then govern the Universe so wisely and benevolently that on reflection everyone approves. But in practice we are not going to get a wise philosopher king. I expect that any government that decides to destroy the Sun for the greater good, against the outrage of the vast majority of people, will also be a bad ruler of the Universe.
I also believe that AI alignment is not binary: even in the worlds where there is no AI takeover, we will probably get an AI we initially can't fully trust to follow the spirit of our commands in exotic situations we can't really understand. In that case, it would be extremely unwise to immediately instruct it to create mind uploads (how faithful would those be?) and bulldoze the world to turn the Sun into computronium. There are a lot of reasons for taking things slow.
Usually rationalists are pretty reasonable about these things: they endorse democratic government and human rights, and they often even like talking about the Long Reflection and taking things slow. But then they start talking about dismantling the Sun! This post can, to be fair, claim it was proposing a less immediate and horrifying implementation (though there really is a missing mood here), but there are other examples, most notably the Great Transhumanist Future song in last year's Solstice, where a coder looks up at the burning Sun disapprovingly, and in twenty years, with a big ol' computer, they will use the Sun as a battery.
I don't know whether the people talking like this are so out of touch that they believe everyone will agree, with a little convincing, to dismantle the Sun in twenty years; or whether they would approve of an AI-enabled dictatorship bulldozing over Earth; or whether they just don't think through the implications. I think it's mostly that they just don't think about it too hard, but I have heard people come out in favor of actually bulldozing Earth (usually including a step where we forcibly increase everyone's intelligence until they agree with the leadership), and I think that's very foolish and bad.
I have my own actual best guesses about what happens in reasonably good futures, which I can get into. (I'll flag for now that I think "preserve Earth itself for as long as possible" is a reasonable Schelling point, compatible with many "go otherwise quite fast" plans.)
> I doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon.
Why do you doubt this? (To be clear, it depends on exact details. But my original query was about a 2-year delay. Proxima Centauri is 4 light-years away. What is your story for how taking only 0.1% of the Sun's energy while we spin up doesn't slow us down by at least 2 years?)
I have more to say but maybe should wait on your answer to that.
Mostly, I think your last comment still had its own missing mood of horror, and/or seemed to be assuming away any tradeoffs.
(I am with you on “many rationalists seem gung ho about this in a way I find scary”)
[1] And even 51% of the vote wouldn't be enough, in any good democracy, to bulldoze over everyone else.