Robin is a libertarian, and Nate used to be, but after the calls to “bomb datacenters” and the vague calls for “regulation” coming from that camp, I don’t buy the libertarian credentials.
A specific phrase like “cradle bags of meat” is dehumanization. Many people view dehumanization as evidence of violent intentions. I understand that you do not, but can you step back and recognize that some people are quite sensitive to this phrasing?
Moreover, when I say “forcefully do stuff to people they don’t like,” I am pointing at a general problem. You seem to interpret this as being only about “forcing people to be uploaded,” which is one specific sub-problem. There are many other instances of the general problem I am referring to, such as:
a) forcing people to take care of economically unproductive digital minds.
It’s clear that Nate views “American tax-payers” with contempt. However, depending on the specific economics of digital minds, people are wary of being forced to give up resources to support something they don’t care about.
or
b) ignoring existing people’s will with regard to dividing the cosmic endowment.
In your analogy, if a small group of people wants to go to Mars and take a small piece of it, that’s OK. However, if they wish to go to Mars and then forcefully prevent other people from going there at all, because they claim they need all of Mars to run some computation, this is not OK.
It’s clear from the interstellar-probes comment that Nate doesn’t see any problem with that.
Again, the general issue here is that I am AWARE of the disagreement about whether or not digital minds (I am not talking about uploads, but about other, simpler-to-create categories of digital minds) are OK to create. Despite holding the majority position of “only create tool-AIs,” I acknowledge that people might disagree. There are ways to resolve this peacefully (divide the star systems between groups). However, despite being in the minority position, LW seems to wish to IGNORE everyone else’s vision of the future and to throw insults like “bags of meat” and “carbon-chauvinists” at them.
I think that when you say “force the idea of “digital life,” “digital minds” or “uploads” onto people” and the like, you are implying that there are people who are in favor of uploading everyone, including people who don’t want to be uploaded. If that’s not what you believe, then I think you should change the wording.
This isn’t about vibes; it’s about what people actually say and what they believe. I think you are misreading the vibes in various ways, and therefore you should stick to what they actually say. It’s not as if Robin Hanson and Eliezer are shy about writing down what they think in excruciating detail online. And they do not say that we should upload everyone, including people who don’t want to be uploaded. For example, here’s an Age of Em quote which (I claim) is representative of what Robin says elsewhere:
Some celebrate our biologically maladaptive behaviors without hoping for collective control of evolution. They accept that future evolution will select for preferences different from theirs, but they still want to act on the preferences they have for as long as they have them. These people have embraced a role as temporary dreamtime exceptions to a larger pattern of history.
Note that he doesn’t say that these “some” are a problem and that we need to fix this problem by force of law. Here’s another quote:
Attempts to limit the freedom of such young people to *voluntarily* choose destructive scanning could result in big conflicts.
Later on, when scans become non-destructive and scanning costs fall, scans are done on far more people, including both old people with proven productivity and adaptability, and younger people with great promise to later become productive and adaptable. Eventually most humans *willing to be scanned* are scanned, to provide a large pool of scans to search for potentially productive workers. By then, many early scans may have gained first-mover advantages over late arrivals. First movers will have adapted more to em environments, and other ems and other systems will have adapted more to them.
Emphasis added. There is nothing in the book that says we should or will forcibly upload people who don’t want to be uploaded, and at least this one passage is explicitly to the contrary (I think there are other passages along the same lines).
if they wish to go to Mars and then forcefully prevent other people from going there at all, because they claim they need all of Mars to run some computation, this is not OK.
It’s clear from the interstellar-probes comment that Nate doesn’t see any problem with that.
I’m confused. In our analogy (uploading ↔ going to Mars), “go to Mars and then forcefully prevent other people from going there” would correspond to “upload and then forcefully prevent other people from uploading”. Since when does Nate want to prevent people from uploading? That’s the opposite of what he wants.
forcing people to take care of economically unproductive digital minds
I’m not sure why you expect digital minds to be unproductive. Well, I guess in a post-AGI era, I would expect both humans and uploads to be equally economically unproductive. Is that what you’re saying?
I agree that a superintelligent AGI sovereign shouldn’t give equal Universal Basic Income shares to each human and each digital mind while also allowing one person to make a gazillion uploaded copies of themselves, which then get a gazillion shares while the biological humans get only one share each. That’s just basic fairness. But if one person switches from a physical body to an upload, that’s not taking shares away from anyone.
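To make the fairness arithmetic concrete, here’s a toy sketch in Python (the allocation rules, names, and numbers are all my own illustrative assumptions, not anything proposed in this thread) contrasting “one share per running copy” with “one share per original person”:

```python
# Toy illustration (made-up rules and numbers): two ways a sovereign
# might allocate UBI shares when one person runs many copies.

def shares_per_copy(population):
    """Each running copy gets one share: mass-copying dilutes everyone else."""
    total_copies = sum(population.values())
    return {person: copies / total_copies for person, copies in population.items()}

def shares_per_original_person(population):
    """Each original person gets one share, no matter how many copies they
    run: uploading or copying yourself takes nothing from anyone else."""
    n_people = len(population)
    return {person: 1 / n_people for person in population}

# Alice stays biological (one body); Bob uploads and runs a billion copies.
population = {"Alice": 1, "Bob": 10**9}

print(shares_per_copy(population))             # Alice gets ~1e-9 of the pie
print(shares_per_original_person(population))  # Alice and Bob get 0.5 each
```

Under the second rule, Bob’s decision to upload (or to copy himself a billion times) leaves Alice’s share untouched, which is the sense in which switching to an upload isn’t taking shares away from anyone.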
ignoring existing people’s will with regard to dividing the cosmic endowment
There’s a legitimate (Luddite) position that says “I am a normal human running at human speed in a human body. And I do not want to be economically outcompeted. And I don’t want to be unemployable. And I don’t want to sit on the sidelines while history swooshes by me at 1000× speed. And I want to be relevant and important. Therefore we should permanently ban anything far more smart / fast / generally competent / inexpensive than humans, including AGI and uploads and other digital minds and human cognitive enhancement.”
You can make that argument. I would even be a bit sympathetic. (…Although I think the prospect of humanity never ever creating superhuman AGI is so extremely remote that arguing over its desirability is somewhat moot.) But if that’s the argument you want to make, then you’re saying something pretty different from “Many other visions expressed online from both sides of the AI safety debate seem to want to force the idea of “digital life,” “digital minds” or “uploads” onto people.” I think that quote is suggesting something different from the Luddite “I don’t want to be economically outcompeted” argument.
(You’re probably thinking: I am using the word “Luddite” because it has negative connotations and I’m secretly trying to throw negative vibes on this argument. That is not my intention. Luddite seems like the best term here. And I don’t see “Luddite” as having negative connotations anyway. I just see it as a category of positions / arguments, pointing at something true and important, but potentially able to be outweighed by other considerations.)