I think you’re mixing up “uploads are impossible”, “uploading people who want to be uploaded is bad”, and “forcibly uploading people whether they want that or not is bad”. These are all very different topics. In this context, I wonder whether you would have been better off splitting them up into different blog posts. At the very least, the title is a bit misleading.
And the third thing there (“forcibly uploading people whether they want that or not is bad”) is not controversial. You say that some people are in favor of universal uploading of everyone including people who don’t want to be uploaded, but none of your links are to people who endorse that position. That’s a pretty crazy position.
To me, it’s obvious large parts of the personality are stored in the body.
I dunno, like, I don’t want to minimize the trauma of spinal injury, but my understanding is that people who become quadriplegic are still recognizably the same people, and they still feel like the same people, and their friends and family still see them as the same people, especially once they get over the initial shock, and the sudden wrenching changes in their day-to-day life and career aspirations, etc. I’m open to being corrected on that.
either have to assume only a small subset of molecular information is relevant (likely a false assumption) OR you have to identify the exact large subset (more on this later) OR you run into thermodynamic information issues where you can’t actually scan a physical object to desired “each molecular location and velocity” accuracy without destroying it. This also ignores any quantum issues that could make everything even more complicated.
The first one. I think the brain is a machine, and it’s not such a complicated machine as to be forever beyond human comprehension—after all it has to be built by a mere 25,000 genes. Some things the machine does by design, and some things it does by chance. Like, “I am me” regardless of whether I’m in a clean room at 20°C or a dusty room at 21°C, but the dust and temperature have a zillion consequences on my quantum-mechanical state. So whatever “I am me” means, it must be only dependent on degrees of freedom that are pretty robust to environmental perturbations. And that means we have a hope of measuring them. (If we know how it works, and hence what degrees of freedom we need to measure!)
It’s a bit like measuring a computer chip—you don’t need to map every atom to be able to emulate it. You won’t emulate SEUs that way, but you didn’t really want to emulate the SEUs anyway.
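The abstraction-level point can be made concrete with a toy sketch (hypothetical code, not a model of any real chip): a logic-level emulator reproduces everything the circuit is designed to do, while saying nothing about atom-scale physics such as SEUs (single-event upsets, random bit flips from e.g. cosmic rays).

```python
# Toy illustration: emulating a circuit at the logic level. The emulator
# captures the *designed* degrees of freedom (Boolean gate behavior) and
# deliberately ignores physical-level detail such as SEUs.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Logic-level model of a 1-bit full adder: returns (sum, carry_out)."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

# The logic-level model reproduces every designed behavior...
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            s, cout = full_adder(a, b, c)
            assert 2 * cout + s == a + b + c  # matches the arithmetic spec

# ...without modeling a single atom. An SEU would be an *extra* perturbation
# injected on top of this model, i.e. exactly the part we didn't want to emulate.
```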
There is a lot to unpack here. I have definitely heard claims from leaders of the community to the effect of “biology is over,” without further explanation of what exactly that means or what specific steps are expected to happen when the majority of people disagree with it. The lack of clarity makes it hard to point to a specific claim of “I will forcefully do stuff to people that they don’t like,” but when I simply say “I and others want what we think of as ‘humans’ to actually keep on living,” I am met with some pushback.
You seem to be saying that the “I” or “self” of a person is somehow static through large changes to the body. On a social and legal level we do need a simple shorthand for what constitutes the same person (family and friends recognize them), but the social level is not the same as the molecular level.
On a molecular level, everything impacts cognition. Good vs. bad food impacts cognition; a cold vs. warm shower impacts cognition. If you read Impro, even putting on a complicated mask during a theater performance impacts cognition.
“I am me,” or whatever you think of as “yourself,” is a product of your quantum-mechanical state. The body fights hard to keep some aspects of that state invariant. If the temperature of the room increases by 1°C, nothing much might change; however, if the body loses the battle and your core temperature increases by 1°C, you likely have either a fever or a heat-related problem, with a corresponding impact on cognition. And if the room is dusty enough, people can become distressed from the sight of it or from lack of oxygen.
So if you claim that only a small portion of molecular information is relevant to the construction of the self, you will fail to capture all the factors that affect cognition and behavior. And considering only a portion of the body’s molecules doesn’t solve the physics problem of obtaining molecular-level information without destroying the body; you would also need to hope that the relevant information is more “macro-scale” than molecules to get around the thermodynamic issues. However, every approximation away from a perfect simulation is likely to drift the cognition and behavior further from the person, which makes the verification problem (did it actually succeed?) harder.
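The drift worry can be illustrated with a toy chaotic system (an analogy only; nobody is claiming brains are logistic maps): an arbitrarily small error in the recorded initial state grows exponentially, so the simulated trajectory eventually carries no information about the original.

```python
# Toy analogy (not a brain model): in a chaotic system, a tiny error in the
# "scanned" initial state compounds until the approximate trajectory ends up
# in a completely different state than the true one.

def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map, a standard textbook example of chaos."""
    return r * x * (1.0 - x)

x_true = 0.3           # the "real" state
x_scan = 0.3 + 1e-12   # the same state with a minuscule measurement error

divergence = []
for _ in range(60):
    x_true, x_scan = logistic(x_true), logistic(x_scan)
    divergence.append(abs(x_true - x_scan))

# The gap starts around 1e-12 and saturates at order 1: the approximation
# has drifted as far from the original as it possibly can.
assert divergence[0] < 1e-9 and max(divergence) > 0.1
```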
This is also why it’s a single post. The problems form a “stack,” in which fuzzy or approximate solutions to the lower layers make the problems in the layers above harder.
Now, there is one particular molecular level worth mentioning. A person’s DNA is the most stable molecular construct in the body; the body preserves it with far more care than whatever we think of as cognition. (How much cognition is shared between a newborn and the same person at 80?) DNA is also built with redundancy, which means that the majority of the body remains intact after a DNA sample is collected. However, I don’t think that “write one’s DNA to the blockchain” is what people have in mind when they say uploads.
I have definitely heard claims from leaders of the community to the effect of “biology is over,” without further explanation of what exactly that means or what specific steps are expected to happen when the majority of people disagree with it. The lack of clarity makes it hard to point to a specific claim of “I will forcefully do stuff to people that they don’t like,” but when I simply say “I and others want what we think of as ‘humans’ to actually keep on living,” I am met with some pushback.
I am very highly confident that “leaders of the community” would be unhappy with a future where people who want to live out their lives as biological humans are unable to do so. I don’t know what you heard, or from whom, but you must have misunderstood it.
I think it’s possible that said biological humans will find that they have no gainful employment opportunities, because anything they can do can alternatively be done by a robot that charges $0.01/hour and does a much better job. If that turns out to be the case, I hope that Universal Basic Income will enable those people to have a long rich “early retirement” full of travel, learning, friendship, family, community, or whatever suits them.
I also think it’s pretty likely that AI will wipe out the biological humans, using plagues and missile strikes and so on. In the unlikely event that there are human uploads, I would expect them to get killed by those same AIs as well. Obviously, I’m not happy to have that belief, and I am working to make it not come true.
Speaking of which, predicting that something will happen is different from (in fact, unrelated to) hoping that it will happen. I’ve never quite wrapped my mind around the fact that people mix these two things up so often. But that’s not a mistake that “community leaders” would make. I wonder if the “biology is over” claim was a prediction that you mistook as being a hope? By the same token, “uploading is bad” and “uploading is impossible” are not two layers of the same stack, they’re two unrelated claims. All four combinations (bad+impossible, good+impossible, bad+possible, good+possible) are perfectly coherent positions for a person to hold.
Robin’s whole Age of Em is basically pronouncing “biology is over” in a cheerful way.
Some posts from Nate:
I want our values to be able to mature! I want us to figure out how to build sentient minds in silicon, who have different types of wants and desires and joys
I don’t want that; instead I want a tool intelligence that augments me by looking at my words and actions. Digital minds (not including “uploads”) are certainly possible, and highly undesirable for most people, simply due to competition for resources and a higher potential for conflict. I don’t buy lack-of-resource-scarcity for a second.
uploading minds; copying humans; interstellar probes that aren’t slowed down by needing to cradle bags of meat, ability to run civilizations on computers in the cold of space …
in the long term, i think you’re looking at stuff at least as crazy as people running thousands of copies of their own brain at 1000x speedup and i think it would be dystopian to try to yoke them to, like, the will of the flesh-bodied American taxpayers (or whatever).
“cradle bags of meat” is a pretty revealing phrase about what he thinks of actual humans and biology
In general, the idea of letting regular people, now and in the future, have any say about the future of digital minds seems to be anathema here. There is no acknowledgement that this is the MINORITY position, and that there is NO REASON other people would go along with it. I don’t know how to interpret these pronouncements that go against the will of the people other than as a full-blown intention to use state violence against people who disagree. And even if you can convince one nation to brutally suppress protests against digital minds, that doesn’t mean others will follow suit.
This is a set of researchers that generally takes egalitarianism, non-nationalism, concern for future minds, non-carbon-chauvinism, and moral humility for granted, as obvious points of background agreement; the debates are held at a higher level than that.
“Non-carbon-chauvinism” is such a funny attempt at an insult. You have already made up an insult for not believing in something that doesn’t exist. Typical atheism :).
The whole phrase comes off as “people without my exact brand of really weird ideas are wrong and not invited to the club.” You can exclude people all you want; just don’t claim that anything like this represents actual human values. I take it with the same level of seriousness as I would my pronouncing that “atheism has no place in technology because it does not have the favor of the Machine God.”
None of those say (or imply) that we should forcibly upload people who don’t want to be uploaded. I think nobody believes that, and I think you should edit your post to not suggest that people do.
By analogy:
I can believe that people who don’t want to go to Mars are missing out on a great experience, but that doesn’t mean I’m in favor of forcing people who don’t want to go to Mars to go to Mars.
I can desire for it to be possible to go to Mars, but that doesn’t mean I’m in favor of forcing people who don’t want to go to Mars to go to Mars.
I can advocate for future Martians to not be under tyrannical control of Earthlings, with no votes or political rights, but that doesn’t mean I’m in favor of forcing people who don’t want to go to Mars to go to Mars.
I can believe that the vast majority of future humans will be Martians and all future technology will be invented by them, but that doesn’t mean I’m in favor of forcing people who don’t want to go to Mars to go to Mars.
Right?
The two people you cite have very strong libertarian tendencies. They do NOT hold the belief “something is a good idea for an individual” → “therefore the government should obviously force everyone to do it” (a belief that has infected other parts of political discourse, a.k.a. everything must be either mandatory or forbidden).
If your belief is “in the unlikely event that uploading is possible at all, and somebody wants to upload, then the government should prevent them from doing so”—as it seems to be—then you should say that explicitly, and then readers can see for themselves which side of this debate is in favor of people imposing their preferences on other people.
Robin is a libertarian. Nate used to be, but after the calls to “bomb datacenters” and the vague calls for “regulation” coming from that camp, I don’t buy the libertarian credentials.
A term like “cradle bags of meat” is dehumanization, and many people view dehumanization as evidence of violent intentions. I understand that you do not, but can you step back and recognize that some people are quite sensitive to this phrasing?
Moreover, when I say “forcefully do stuff to people they don’t like,” I mean a general problem. You seem to interpret it as being only about “forcing people to be uploaded,” which is a specific sub-problem. There are many other instances of the general problem I am referring to, such as:
a) forcing people to take care of economically unproductive digital minds.
It’s clear that Nate views “American taxpayers” with contempt. And depending on the specific economics of digital minds, people are wary of being forced to give up resources for something they don’t care about.
or
b) ignoring existing people’s will with regard to dividing the cosmic endowment.
In your analogy, if a small group of people wants to go to Mars and take a small piece of it, that’s OK. However, if they wish to go to Mars and then forcefully prevent other people from going there at all, because they claim they need all of Mars to run some computation, that is not OK.
It’s clear from the interstellar-probes comment that Nate doesn’t see any problem with that.
Again, the general issue here is that I am AWARE of the disagreement about whether or not digital minds (I am not talking about uploads, but about other, simpler-to-create categories of digital minds) are OK to create. Despite being in the majority position of “only create tool-AIs,” I acknowledge that people might disagree. There are ways to resolve this peacefully (divide the star systems between groups). However, despite being in the minority position, LW seems to wish to IGNORE everyone else’s vision of the future and hurl insults like “bags of meat” and “carbon-chauvinists.”
I think when you say “force the idea of “digital life,” “digital minds” or “uploads” onto people” and such, you are implying that there are people who are in favor of uploading everyone including people who don’t want to be uploaded. If that’s not what you believe, then I think you should change the wording.
This isn’t about vibes, it’s about what people actually say, and what they believe. I think you are misreading vibes in various ways, and therefore you should stick to what they actually say. It’s not like Robin Hanson and Eliezer are shy about writing down what they think in excruciating detail online. And they do not say that we should upload everyone including people who don’t want to be uploaded. For example, here’s an Age of Em quote which (I claim) is representative of what Robin says elsewhere:
Some celebrate our biologically maladaptive behaviors without hoping for collective control of evolution. They accept that future evolution will select for preferences different from theirs, but they still want to act on the preferences they have for as long as they have them. These people have embraced a role as temporary dreamtime exceptions to a larger pattern of history.
Note that he doesn’t say that these “some” are a problem, or that we need to fix this problem by force of law. Here’s another quote:
Attempts to limit the freedom of such young people to voluntarily choose destructive scanning could result in big conflicts.
Later on, when scans become non-destructive and scanning costs fall, scans are done on far more people, including both old people with proven productivity and adaptability, and younger people with great promise to later become productive and adaptable. Eventually most humans willing to be scanned are scanned, to provide a large pool of scans to search for potentially productive workers. By then, many early scans may have gained first-mover advantages over late arrivals. First movers will have adapted more to em environments, and other ems and other systems will have adapted more to them.
Emphasis added. There is nothing in the book that says we should or will forcibly upload people who don’t want to be uploaded, and at least this one passage says explicitly the opposite (I think there are other passages along the same lines).
if they wish to go to Mars and then forcefully prevent other people from going there at all, because they claim they need all of Mars to run some computation, that is not OK.
It’s clear from the interstellar-probes comment that Nate doesn’t see any problem with that.
I’m confused. In our analogy (uploading ↔ going to Mars), “go to Mars and then forcefully prevent other people from going there” would correspond to “upload and then forcefully prevent other people from uploading”. Since when does Nate want to prevent people from uploading? That’s the opposite of what he wants.
forcing people to take care of economically unproductive digital minds
I’m not sure why you expect digital minds to be unproductive. Well, I guess in a post-AGI era, I would expect both humans and uploads to be equally economically unproductive. Is that what you’re saying?
I agree that a superintelligent AGI sovereign shouldn’t give equal Universal Basic Income shares to each human and each digital mind while also allowing one person to make a gazillion uploaded copies of themselves which then get a gazillion shares while the biological humans only get one share each. That’s just basic fairness. But if one person switches from a physical body to an upload, that’s not taking shares away from anyone.
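That fairness point can be sketched with toy numbers (all hypothetical): compare handing out UBI shares per running instance versus per original person (“lineage”).

```python
# Toy sketch of the fairness point (all numbers hypothetical): if shares are
# handed out per *instance*, one person who spins up many uploaded copies
# dilutes everyone else's share; granting shares per *lineage* (one share per
# original person, split among their copies) leaves the split unchanged.

def per_instance_share(copies_per_person: list[int]) -> list[float]:
    """Each running instance gets an equal share; return each lineage's total."""
    total_instances = sum(copies_per_person)
    return [n / total_instances for n in copies_per_person]

def per_lineage_share(copies_per_person: list[int]) -> list[float]:
    """Each original person gets one share, however many copies they run."""
    people = len(copies_per_person)
    return [1 / people for _ in copies_per_person]

# Three biological humans and one person running 1,000 uploaded copies:
copies = [1, 1, 1, 1000]
print(per_instance_share(copies))  # the three humans get ~0.1% each
print(per_lineage_share(copies))   # everyone still gets 25%
```

Under per-lineage accounting, one person switching from a body to an upload (1 copy either way) changes nobody’s share, which is the point made above.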
ignoring existing people’s will with regards to dividing the cosmic endownment
There’s a legitimate (Luddite) position that says “I am a normal human running at human speed in a human body. And I do not want to be economically outcompeted. And I don’t want to be unemployable. And I don’t want to sit on the sidelines while history swooshes by me at 1000× speed. And I want to be relevant and important. Therefore we should permanently ban anything far more smart / fast / generally competent / inexpensive than humans, including AGI and uploads and other digital minds and human cognitive enhancement.”
You can make that argument. I would even be a bit sympathetic. (…Although I think the prospect of humanity never ever creating superhuman AGI is so extremely remote that arguing over its desirability is somewhat moot.) But if that’s the argument you want to make, then you’re saying something pretty different from “Many other visions expressed online from both sides of the AI safety debate seem to want to force the idea of ‘digital life,’ ‘digital minds’ or ‘uploads’ onto people.” I think that quote is suggesting something different from the Luddite “I don’t want to be economically outcompeted” argument.
(You’re probably thinking: I am using the word “Luddite” because it has negative connotations and I’m secretly trying to throw negative vibes on this argument. That is not my intention. Luddite seems like the best term here. And I don’t see “Luddite” as having negative connotations anyway. I just see it as a category of positions / arguments, pointing at something true and important, but potentially able to be outweighed by other considerations.)