You are putting words in people’s mouths to accuse lots of people of wanting to round up the Amish and haul them to extermination camps, and I am disappointed that you would resort to such accusations.
Yeah, maybe I just got too angry. As we discussed in other comments, I believe that from the astronomical acceleration perspective the real deal is maximizing the initial industrialization of Earth and its surroundings, which does require killing off (and mind-uploading) the Amish and everyone else. Sure, if people are only arguing that we should dismantle the Sun and Earth after millennia, that’s more acceptable, but then I really don’t see the point: we could have built out our industrial base around Alpha Centauri by then.
The part that is frustrating to me is that neither the original post nor any of the commenters arguing with me caveat their position with “of course, we would never want to destroy Earth before we can save all the people who want to live in their biological bodies, even though this is plausibly the majority of the cost in cosmic slow-down”. If you agree with this, please say so; I would still have quarrels about removing people to artificial planets if they don’t want to go, but I would be less horrified. But so far, no one has been willing to clarify that they don’t want to destroy Earth before saving the biological people, and I really have heard people say in private conversations things like “we will immediately kill all the bodies and upload the minds, the people will thank us later once they understand better”, which makes me paranoid.
Ben, Oliver, Raemon, Jessica, are you willing to commit to not wanting to destroy Earth if it requires killing the biological bodies of a significant number of non-consenting people? If so, my ire was not directed against you and I apologize to you.
It is good to have deontological commitments about what you would do with a lot of power. But this situation is very different from just “a lot of power”; it’s also “if you were to become wiser and more knowledgeable than anyone in history so far”. One can imagine the Christians of old asking for a commitment: “If you get this new scientific and industrial civilization that you want, 2,000 years from now, will you commit to following the teachings of Jesus?” And along the way I would sadly find out that even though it seemed like a good and moral commitment at the time, it totally screwed my ability to behave morally in the future, because Christianity is necessarily predicated on tons of falsehoods and many of its teachings are immoral.
But there is some version of this commitment I think might be good to make… something like “Insofar as the players involved are all biological humans, I will respect the legal structures that exist and the existence of countries, and will not relate to them in ways that would be considered worthy of starting a war in their defense”. But I’m not certain about this: for instance, what if most countries in the world build 10^10 digital minds and are essentially torturing them? I may well wish to overthrow a country that consists primarily of tortured digital minds with a small number of biological humans sitting on thrones on top of them, and I am not willing to commit to not doing that presently.
I understand that there are ethically bad things one can do with post-singularity power, but I do not currently see a clear way to commit to certain ethical behaviors that will survive contact with massive increases in knowledge and wisdom. I am interested in whether anyone has made other commitments about post-singularity life (or “on the cusp of the singularity” life) that they expect to survive contact with reality.
Added: At the very least I can say that I am not going to make commitments to do specific things that violate my current ethics. I have certainly made no positive commitment to violate people’s bodily autonomy nor have such an intention.
Fair, I also haven’t made any specific commitments; I phrased it wrongly. I agree there can be extreme scenarios, with trillions of digital minds tortured, where you’d maybe want to declare war on the rest of society. But I would still like people to write down that “of course, I wouldn’t want to destroy Earth before we can save all the people who want to live in their biological bodies, just to get a few years of acceleration in the cosmic conquest”. I feel a sentence like this should really have been included in the original post about dismantling the Sun, and as long as people are not willing to write this down, I remain paranoid that they would in fact haul the Amish to extermination camps if it feels like a good idea at the time. (As I said, I have met people who really held this position.)
(Meta: Apologies for running the clock, but it is 1:45am where I am and I’m too sleepy to keep going on this thread, so I’m bowing out for tonight. I want to respond further, but I’m on vacation right now so I do wish to disclaim any expectations of a speedy follow-up.)