Fair, I also haven’t made any specific commitments, I phrased it wrongly. I agree there can be extreme scenarios with trillions of digital minds tortured where you’d maybe want to declare war on the rest of society. But I would still like people to write down that “of course, I wouldn’t want to destroy Earth before we can save all the people who want to live in their biological bodies, just to get a few years of acceleration in the cosmic conquest”. I feel a sentence like this should really have been included in the original post about dismantling the Sun, and as long as people are not willing to write this down, I remain paranoid that they would in fact haul the Amish to the extermination camps if it feels like a good idea at the time. (As I said, I met people who really held this position.)
As I explain in more detail in my other comment, I expect market-based approaches not to dismantle the Sun anytime soon. I’m interested whether you know of any governance structure you support that you think would probably lead to dismantling the Sun within the next few centuries.
I feel reassured that you don’t want to Eat the Earth while there are still biological humans who want to live on it.
I still maintain that under governance systems I would like, I would expect the outcome to be very conservative with the solar system in the next thousand years. One default governance structure I quite like is to parcel out the Universe equally among the people alive during the Singularity, have a binding constitution on what they can do in their fiefdoms (no torture, etc.), and allow them to trade and give away their stuff to their biological and digital descendants. There could also be a basic income going to all biological people,[1] though not to digital people, as it’s too easy to mass-produce them.
One year of delay in cosmic expansion costs us around 1 in a billion of the reachable Universe, under some assumptions on where the grabby aliens are (if they exist). One year also costs around 1 in a billion of the Sun’s mass being burned, if, like Habryka, you care about using the solar system optimally for the sake of the biological humans who want to stay. So one year of delay can be bought by 160 people paying out 10% of their wealth. I really think you won’t do things like moving the Earth closer to the Sun in the next 200 years; there will just always be enough people to pay out. It only takes 10,000 traditionalist families, literally the Amish could easily do it. And it won’t matter much: the cosmic acceleration will soon become a moot point as we build out other industrial bases, and I don’t expect the biological people to feel much of a personal need to dismantle the Sun anytime soon. Maybe in 10,000 years the objectors will run out of money, and the bio people either overpopulate or have expensive hobbies like building planets for themselves and decide to dismantle the Sun, though I expect them to be rich enough to just haul in matter from other stars if they want to.
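To make the “160 people” figure concrete, here is a minimal back-of-the-envelope sketch. It assumes, as in the governance sketch above, that the Universe is parceled out equally, and it takes a population of roughly 8 billion (my assumption, for illustration); the two 1-in-a-billion cost figures are the ones quoted above.

```python
# Back-of-the-envelope check of the "160 people paying 10%" figure.
# Assumption (mine, for illustration): the Universe is parceled out equally
# among roughly 8 billion people, as in the governance sketch above.
population = 8e9
cost_of_one_year_delay = 1e-9 + 1e-9  # ~1 billionth of the reachable Universe
                                      # plus ~1 billionth of the Sun's mass burned

full_shares_needed = cost_of_one_year_delay * population  # per-person shares needed for one year
contributors_at_10_percent = full_shares_needed / 0.10    # people each giving 10% of their share

print(full_shares_needed)          # 16.0
print(contributors_at_10_percent)  # 160.0 -> one year of delay
```

Under the same assumptions, 200 years of delay takes about 160 × 200 = 32,000 such contributors, which is roughly consistent with the 10,000 traditionalist families mentioned above.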
By the way, I recommend Tim Underwood’s sci-fi novel, The Accord, as a very good exploration of these topics; I think it’s my favorite sci-fi novel.
As for the 80 trillion stars, I agree it’s a real loss, but for me this type of sadness feels “already priced in”. I already accepted that the world won’t and shouldn’t be entirely my personal absolute kingdom, so other people’s decisions will cause a lot of waste from my perspective, and 0.00000004% is just a really negligible part of this loss. In this, I think my analogy to current governments is quite apt: I feel similarly about them, in that I already accepted that the world will be wasteful compared to the rule of a dictatorship perfectly aligned with me, but that’s how it needs to be.
- ^
Though you need to pay attention to overpopulation. If the average biological couple has 2.2 children, the Universe runs out of atoms to support humans in around 50 thousand years (see the rough sketch after these footnotes). Exponential growth is crazy fast.
- ^
I maintain that biological humans will need to do population control at some point. If they decide that enacting population control in the solar system at a later population level is worth it to them so they can dismantle the Sun, then they can go for it. My guess is that they won’t, and will enact population control earlier.
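A rough sketch of the overpopulation arithmetic from the first footnote above. The atom counts, starting population, and generation length are my own illustrative assumptions, not figures from the comment.

```python
import math

# How long until 2.2 children per couple exhausts the Universe's atoms?
# All constants below are rough, assumed orders of magnitude.
atoms_in_universe = 1e80         # observable-Universe atom count, order of magnitude
atoms_per_human = 1e28           # atoms in a human body, order of magnitude
starting_population = 1e10
growth_per_generation = 2.2 / 2  # 2.2 children per couple -> 1.1x per generation
years_per_generation = 30

max_population = atoms_in_universe / atoms_per_human
generations = math.log(max_population / starting_population) / math.log(growth_per_generation)
print(round(generations * years_per_generation))  # ~30,000 years
```

With these assumptions it comes out around 30,000 years; the exact figure is very sensitive to the assumed generation length and atoms per person, but the point stands that it is tens of thousands of years, not millions.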
I think that the coder looking up and saying that the Sun burning is distasteful but the Great Transhumanist Future will come in 20 years, along with the later mention of “the Sun is a battery”, together imply that the Sun is getting dismantled in the near future. I guess you can debate how strong the implication is, maybe they just want to dismantle the Sun in the long term and are currently only using it as a battery in some benign way, but I think that’s not the most natural interpretation.
Yeah, maybe I just got too angry. As we discussed in other comments, I believe that from the astronomical acceleration perspective the real prize is maximizing the initial industrialization of Earth and its surroundings, which does require killing off (and mind uploading) the Amish and everyone else. Sure, if people are only arguing that we should dismantle the Sun and Earth after millennia, that’s more acceptable, but then I really don’t see the point, as we can build out our industrial base on Alpha Centauri by then.
The part that is frustrating to me is that neither the original post nor any of the commenters arguing with me caveat their position with “of course, we would never want to destroy Earth before we can save all the people who want to live in their biological bodies, even though this is plausibly the majority of the cost in cosmic slow-down”. If you agree with this, please say so; I would still have quarrels about removing people to artificial planets if they don’t want to go, but I would be less horrified. But so far, no one has been willing to clarify that they don’t want to destroy Earth before saving the biological people, and I really did hear people say in private conversations things like “we will immediately kill all the bodies and upload the minds, the people will thank us later once they understand better” and things of that sort, which makes me paranoid.
Ben, Oliver, Raemon, Jessica, are you willing to commit to not wanting to destroy Earth if it requires killing the biological bodies of a significant number of non-consenting people? If so, my ire was not directed against you and I apologize to you.
I expect non-positional material goods to be basically saturated for Earth people in a good post-Singularity world, so I don’t think you can promise to make them twice as rich. And also, people dislike drastic change and new things they don’t understand. 20% of the US population refused the potentially life-saving covid vaccine out of distrust of new things they don’t understand. Do you think they would happily move to a new planet with an artificial sky maintained by supposedly benevolent robots? Maybe you could buy off some percentage of the population if material goods weren’t saturated, but surely not more than you could convince to get the vaccine? Also, don’t some religions (Islam?) have specific laws about what to do at sunrise and sunset and so on? Do you think all the imams would go along with moving to the new artificial Earth? I really think you are out of touch with the average person on this one, but we can go out to the streets and interview some people on the matter, though Berkeley is maybe not the most representative place for this.
(Again, if you are talking about cultural drift over millennia, that’s more plausible, though I’m below 50% they would dismantle the Sun. But I’m primarily arguing against dismantling the Sun within twenty years of the Singularity.)
Are you arguing that, if technologically possible, the Sun should be dismantled in the first few decades after the Singularity, as is implied in the Great Transhumanist Future song, the main thing I’m complaining about here? In that case, I don’t know of any remotely just and reasonable (democratic, market-based, or other) governance structure that would allow that to happen, given how the majority of people feel.
If you are talking about population dynamics, ownership and voting shifting over millennia to the point that they decide to dismantle the Sun, then sure, that’s possible, though that’s not what I expect to happen, see my other comment on market trades and my reply to Habryka on population dynamics.
You mean that people on Earth and the solar system colonies will have enough biological children, and space travel to other stars will be hard enough for biological people, that they will want the resources from dismantling the Sun? I suppose that’s possible, though I expect they will put some kind of population control for biological people in place before that happens. I agree that also feels aversive, but at some point it needs to be done anyway, otherwise exponential population growth just brings us back to the Malthusian limit a few tens of thousands of years from now, even if we use up the whole Universe. (See Tim Underwood’s excellent rationalist sci-fi novel on the topic.)
If you are talking about ems and digital beings, not biological humans, I don’t think they will or should have decision rights over what happens with the solar system, as they can simply move to other stars.
I agree that not all decisions about the cosmos should be made in a majoritarian democratic way, but I don’t see how replacing the Sun with artificial light can be done by market forces under normal property rights. I think you currently would not be allowed to build a giant glass dome around someone’s plot of land, and this feels at least as strong.
I’m broadly sympathetic to having property rights and markets in the post-Singularity future, and probably the people with scope-sensitive and longtermist preferences will be able to buy out the future control of far-away things from the normal people who don’t care about these too much. But these trades will almost certainly result in the solar system being owned by a coalition of normal people, except if they start with basically zero capital. I don’t know what you imagine the initial capital allocation to look like in your market-based post-Singularity world, but if the vast majority of the population doesn’t have enough control to even save the Sun, then probably something went deeply wrong.
I agree that I don’t viscerally feel the loss of the 200 galaxies, and maybe that’s a deficiency. But I still find this position crazy. I feel this is a decent parallel dialogue:
Other person: “Here is something I thought of that would increase health outcomes in the world by 0.00000004%.”
Me: “But surely you realize that this measure is horrendously unpopular, and the only way to implement it is through a dictatorial world government.”
Other person: “Well yes, I agree it’s a hard dilemma, but in absolute terms, 0.00000004% of the world population is 3 people, so my intervention would save 3 lives. Think about what a horrible tragedy every death is, I really feel there is a missing mood here when you don’t even consider the upside of my proposal.”
I do feel sad when the democratic governments of the world screw up policy in a big way according to my beliefs and values (as they often do). I still believe that democracy is the least bad form of government, but yes, I feel the temptation of how much better it would be if we were instead governed by a perfect philosopher king who happens to be aligned with my values. But 0.00000004% is just a drop in the ocean. Similarly, I believe that we should try to maintain a relatively democratic form of government at least in the early AI age (and then people can slowly figure out if they find something better than democracy). And yes, I expect that democracies will do a lot of incredibly wasteful, stupid, and sometimes evil things by my lights, and I will sometimes wish that somehow I could have become the philosopher king. That’s just how things always are. But leaving Earth alone really is chump change, and won’t be among the top thousand things I disagree with democracies on.
(Also, again, I think there will also be a lot of value in conservatism and caution, especially since we probably won’t be able to fully trust our AIs on the most complicated issues. And I also think there is something intrinsically wrong about destroying Earth: I think that if you cut down the world’s oldest tree to increase health outcomes by 0.00000004%, you are doing something wrong, and people have a good reason to distrust you.)
Yes, I wanted to argue something like this.
I think this is a false dilemma. If all human cultures on Earth come to the conclusion in 1000 years that they would like the Sun to be dismantled (which I very much doubt), then sure, we can do that. But by that point, we could already have built awesome industrial bases by dismantling Alpha Centauri, or just by dismantling the 0.1% of the Sun whose removal doesn’t affect anything on Earth. I doubt that totally dismantling the Sun after centuries would significantly move up the time we reach the cosmic event horizon.
The thing that actually has costs is not immediately bulldozing Earth and turning it into a maximally efficient industrial powerhouse at the cost of killing every biological body, or forgoing the scenario where the ASI dismantles the Sun on short notice (the post alludes to 10,000 years being a very conservative estimate and to “the assumption that ASI eats the Sun within a few years”). But that’s not going to happen democratically. There is no way you get 51% of people [1] to vote for bulldozing Earth and killing their biological bodies, and I very much doubt you get that vote even for dismantling the Sun in a few years and putting some fake sky around Earth for protection. It’s possible there could be a truly wise philosopher king who could, with an aching heart, overrule everyone else’s objections and bulldoze Earth to get those extra 200 galaxies at the edge of the Universe, but then govern the Universe wisely and benevolently in a way that people on reflection all approve of. But in practice, we are not going to get a wise philosopher king. I expect that any government that decides to destroy the Sun for the greater good, against the outrage of the vast majority of people, will also be a bad ruler of the Universe.
I also believe that AI alignment is not a binary, and even in the worlds where there is no AI takeover, we will probably get an AI we initially can’t fully trust to follow the spirit of our commands in exotic situations we can’t really understand. In that case, it would be extremely unwise to immediately instruct it to create mind uploads (how faithful would those be?) and bulldoze the world to turn the Sun into computronium. There are a lot of reasons for taking things slow.
Usually rationalists are pretty reasonable about these things, and endorse democratic government and human rights, and they even often like talking about the Long Reflection and taking things slow. But then they start talking about dismantling the Sun! This post can kind of defend itself by saying that it proposes a less immediate and horrifying implementation (though there really is a missing mood here), but there are other examples, most notably the Great Transhumanist Future song in last year’s Solstice, where a coder looks up at the burning Sun disapprovingly, and in twenty years, with a big ol’ computer, they will use the Sun as a battery.
I don’t know if the people talking like that are so out of touch that they believe that with a little convincing everyone will agree to dismantle the Sun in twenty years, or they would approve of an AI-enabled dictatorship bulldozing over Earth, or they just don’t think through the implications. I think it’s mostly that they just don’t think about it too hard, but I did hear people coming out in favor of actually bulldozing down Earth (usually including a step where we forcibly increase everyone’s intelligence until they agree with the leadership), and I think that’s very foolish and bad.
- ^
And even 51% of the vote wouldn’t be enough in any good democracy to bulldoze over everyone else
I might write a top level post or shortform about this at some point. I find it baffling how casually people talk about dismantling the Sun around here. I recognize that this post makes no normative claim that we should do it, but it doesn’t say that it would be bad either, and expects that we will do it even if humanity remains in power. I think we probably won’t do it if humanity remains in power, we shouldn’t do it, and if humanity disassembles the Sun, it will probably happen for some very bad reason, like a fanatical dictatorship getting in power.
If we get some even vaguely democratic system that respects human rights at least a little, then many people (probably the vast majority) will want to live on Earth in their physical bodies, many will want to have children, and many of those children will also want to live on Earth and have children of their own. I find it unlikely that all subcultures that want this will die out on Earth in 10,000 years, especially considering the selection effects: the subcultures that prefer to have natural children on Earth are the ones that natural selection favors on Earth. So the scenarios where humanity dismantles the Sun probably involve a dictatorship rounding up the Amish and killing them, while maybe uploading their minds somewhere, against all their protestations. Or possibly rounding up the Amish and forcibly “increasing their intelligence and wisdom” by some artificial means, until they realize that their “coherent extrapolated volition” was in agreement with the dictatorship all along, and then killing off their bodies after their new minds consent. I find this option hardly any better. (Also, it’s not just the Amish you are hauling to the extermination camps kicking and screaming, but my mother too. And probably your mother as well. Please don’t do that.)
Also, I think the astronomical waste is probably pretty negligible. You can probably create a very good industrial base around the Sun with just some Dyson swarm that doesn’t take up enough light to be noticeable from Earth. And then you can just send out some probes to Alpha Centauri and the other neighboring stars to dismantle them if we really want to. How much time do we lose by this? My guess is at most a few years, and we probably want to take some years anyway to do some reflection before we start any giant project.
People sometimes accuse the rationalist community of being aggressive naive utilitarians, who only believe that the AGI is going to kill everyone because they are projecting themselves onto it, as they too would want to kill everyone if they got power, so they could establish their mind-uploaded, we-are-the-grabby-aliens, turn-the-stars-into-computronium utopia a few months earlier that way. I think this accusation is mostly false, and most rationalists are in fact pretty reasonable and want to respect other people’s rights and so on. But when I see people casually discussing dismantling the Sun, with only one critical comment (Mikhail’s) saying that we shouldn’t do it, and it shows up in Solstice songs as a thing we want to do in the Great Transhumanist Future twenty years from now, I start worrying again that the critics are right, and we are the bad guys.
I prefer to think that it’s not because people are in fact happy about massacring the Amish and their own mothers, but because dismantling the Sun is a meme, and people don’t think through what it means. Anyway, please stop.
(Somewhat relatedly, I think it’s not obvious at all that if a misaligned AGI takes over the world, it will dismantle the Sun. It is more likely to do it than humanity would be, but still, I don’t know how you could be at all confident that the misaligned AI that first takes over will be the type of linear utilitarian optimizer that really cares about conquering the last stars at the edge of the Universe, and so needs to dismantle the Sun in order to speed up its conquest by a few years.)
What is an infra-Bayesian Super Mario supposed to mean? I studied infra-Bayesianism under Vanessa for half a year, and I have no idea what this could possibly mean. I asked Vanessa when this post came out and she also said she can’t guess what you might mean by this. Can you explain what this is? The fact that the only part of the plan I know something about seems to be nonsense makes me very skeptical.
Also, can you give more information or link to a resource on what Davidad’s team is currently doing? It looks like they are the best-funded AI safety group that currently exists (except if you count Anthropic), but I never hear about them.
I don’t agree with everything in this post, but I think it’s a true and underappreciated point that “if your friend dies in a random accident, that’s actually only a tiny loss according to MWI.”
I usually use this point to ask people to retire the old argument that “Religious people don’t actually believe in their religion, otherwise they would be no more sad at the death of a loved one than if their loved one sailed to Australia.” I think this “should be” true of MWI believers too, and we still feel very sad when a loved one dies in an accident.
I don’t think this means people don’t “really believe” in MWI; it’s just that MWI and religious afterlife both start out as System 2 beliefs, and it’s hard to internalize them on System 1. But religious people are already well aware that it’s hard to internalize Faith on System 1 (Mere Christianity has a full chapter on the topic), so saying that “they don’t really believe in their religion because they grieve” is an unfair dig.
I fixed some parts that were easy to misunderstand: I meant that the $500k is the LW hosting + Software subscriptions and the Dedicated software + accounting stuff together. And I didn’t mean to imply that the labor cost of the 4 people is $500k; that was a separate term in the costs.
Is Lighthaven still cheaper if we take into account the initial funding spent on it in 2022 and 2023? I was under the impression that buying Lighthaven was one of the things that made a lot of sense when the community believed it would have access to FTX funding, and that once we had bought it, it made sense to keep it, but we wouldn’t have bought it once FTX was out of the game. But in case this was a misunderstanding and Lighthaven saves money in the long run compared to the previous option, that’s great news.
I donated $1000. Originally I was worried that this is a bottomless money-pit, but looking at the cost breakdown, it’s actually very reasonable. If Oliver is right that Lighthaven funds itself apart from the labor cost, then the real costs are $500k for the hosting, software, and accounting costs of LessWrong (this is probably an unavoidable cost and seems obviously worthy of being philanthropically funded), plus paying 4 people (equivalent to 65% of 6 people) to work on LW moderation and upkeep (it’s an unavoidable cost to have some people working on LW, 4 seems probably reasonable, and this is also something that obviously should be funded), plus paying 2 people to keep Lighthaven running (given the surplus value Lighthaven generates, it seems reasonable to fund this), plus a one-time cost of $1 million to cover the initial cost of Lighthaven (I’m not super convinced it was a good decision to abandon the old Lightcone offices for Lighthaven, but I guess it made sense in the funding environment of the time, and once we made this decision, it would be silly not to fund the last $1 million of initial cost before Lighthaven becomes self-funded). So altogether I agree that this is a great thing to fund, and it’s very unfortunate that some of the large funders can’t contribute anymore.
(All of this relies on the hope that Lighthaven actually becomes self-funded next year. If it keeps producing big losses, then I think the funding case will become substantially worse. But I expect Oliver’s estimates to be largely trustworthy, and we can still decide to decrease funding in later years if it turns out Lighthaven isn’t financially sustainable.)
I’m considering donating. Can you give us a little more information on the breakdown of the costs? What are the typical large expenses that the $1.6 million upkeep of Lighthaven consists of? Is this a usual cost for a similarly sized event space, or is there something about the location or the specialness of the place that makes it more expensive?
How much money does running LW cost? The post says it’s >$1M, which somewhat surprised me, but I have no idea what the usual cost of running such a site is. Is the cost mostly server hosting, or salaries for content moderation, or salaries for software development, or something I haven’t thought of?
Does anyone know of a zinc acetate lozenge that isn’t peppermint flavored? I really dislike peppermint, so I’m not sure it would be worth it to drink 5 peppermint-flavored glasses of water a day to decrease the duration of a cold by one day, and I haven’t found other zinc acetate lozenge options yet; the acetate version seems to be rare among zinc supplements. (Why?)