If you think that kind of argument holds water, you should commit suicide today lest a sadist kidnap you and torture you in real life.
I would point out that the scenario I was writing about was clearly one in which ems are common and em society is stable. If you think that in such a society, there won’t be em kidnapping or hacking, for interrogation, slavery, or torture, you hold very different views from mine indeed. (If you think such a society won’t exist, that’s another thing entirely.)
As a human, you can only die once.
I think such a society won’t exist. I think that much war and conflict in general can be reduced to unmet human needs. Those needs could be physical (land, oil, money) or emotional/ideological (theism, patriotism, other forms of irrationality). But if uploads were to happen, I think we would be able to meet those needs. Wouldn’t uploading imply the end of psychological disturbance? You wouldn’t need to resort to chemically altering the brain; you could do it digitally. There would be no physical limitations on how you alter the data in your uploaded mind, whereas there are real limitations on chemically altering your mind today. You would no longer need to go on vacation or buy video games; you could just go to a virtual reality.
Even if the things I predict don’t happen (I use them more as examples than as predictions), the point is that trying to imagine the specific psychology of an upload is more speculative than new-age metaphysics. However, I think that as far as we’re concerned, we can make extremely vague predictions like generally good (above) or generally bad (kidnapping, hijacking, torture). Since generally bad things are generally due to psychological disturbance or lack of resources, my bet is on generally good.
There is kidnapping for interrogation, slavery and torture today, so there is no reason to believe there won’t be such in the future. But I don’t believe it will make sense in the future to commit suicide at the mere thought, any more than it does today.
As for whether such a society will exist, I think it’s possible. There may come a day when people don’t have to die. And there is a better chance of that happening if we refrain from poisoning our minds with scare stories optimized for appeal to primate brains over correspondence to external reality.
At least, not unless you are an upload and your suicide trigger is the equivalent of those tripwires that burn the contents of the safe.
I can make a reasonable estimate of the risk of being kidnapped or arrested and being tortured.
There’s a lot less information about the risk of ems being tortured, and such information may never be available, since I think it’s unlikely that computers can be monitored to that extent.
People do commit suicide to escape torture, but it doesn’t seem to be the most common response. Also, fake executions are considered to be a form of torture because of the fear of death. So far as I know, disappointment at finding out one hasn’t been killed isn’t considered to be part of what’s bad about fake executions.
“I can make a reasonable estimate of the risk of being kidnapped or arrested and being tortured.
“There’s a lot less information about the risk of ems being tortured, and such information may never be available, since I think it’s unlikely that computers can be monitored to that extent.”
If we can’t make a reasonable estimate, what estimate do we make? The discounted validity of the estimate is incorporated in the prior probability. (Actually I’m not sure if this always works, but a consistent Bayesianism must so hold. Please correct me if this is wrong.)
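A minimal sketch of that move, with entirely hypothetical numbers: treat the estimate’s “validity” as the probability that it tracks reality at all, and fold the discount into the final probability as a mixture with a base-rate prior.

```python
# A sketch, not a full Bayesian treatment: an unreliable estimate is
# blended with a base-rate prior, so the discount for doubtful validity
# shows up directly in the probability we act on.

def discounted_estimate(prior, estimate, validity):
    """Mixture of 'estimate is informative' and 'estimate is noise'.

    validity -- probability the estimate tracks reality at all;
                at 0 we keep the prior, at 1 we adopt the estimate.
    """
    return validity * estimate + (1 - validity) * prior

# Hypothetical numbers for the em-torture risk: base rate 0.01, and a
# scary estimate of 0.5 that we grant only 10% validity.
print(discounted_estimate(prior=0.01, estimate=0.5, validity=0.1))  # ~0.059
```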
My reaction to the most neutral form of the question about uploading—”If offered the certain opportunity of success at no cost, would I accept?”—is “No.” The basis is my fear that I wouldn’t like the result. I justify it—perhaps after the fact—by assigning an equal a priori likelihood to a good and a bad outcome. In Nancy’s terms, I’m saying that we have no ability to make a reasonable estimate. The advantage of putting it my way is that it implies a conclusion, rather than resulting in agnosticism (but at the cost of a less certain justification).
In general, I think people over-value the continuation of life. One consequence is that people put too little effort into mitigating the circumstances of their death—which often means inclining it to come sooner rather than later.
What’s the status of error bars in doing this sort of reasoning? It seems to me that a probability of .5 +/- epsilon (a coin you have very good reason to think is honest) is a very different thing from .5 +/- .3 (outcome of an election in a country about which you only know that they have elections and the names of the candidates).
I’m not sure +/- .3 is reasonable—I think I’m using it to represent that people familiar with that country might have a good idea who’d win.
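One concrete way to cash out the difference (an illustration I am adding, not something either commenter specified): model each “.5 +/- x” belief as a Beta distribution. Both beliefs have the same mean, but they respond very differently to a single new piece of evidence.

```python
import math

def beta_mean_sd(a, b):
    """Mean and standard deviation of a Beta(a, b) belief over a probability."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, math.sqrt(var)

# An honest coin: many (pseudo-)observations, so 0.5 +/- epsilon.
print(beta_mean_sd(500, 500))  # (0.5, ~0.016)

# An unknown election: almost no information, so 0.5 +/- ~0.22.
print(beta_mean_sd(2, 2))      # (0.5, ~0.224)

# One new observation (a single heads, or one poll favoring candidate A)
# barely moves the coin belief but moves the election belief a lot:
print(beta_mean_sd(501, 500)[0])  # ~0.5005
print(beta_mean_sd(3, 2)[0])      # 0.6
```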
Do you have some substantial disagreement with the possibility of the scenario?
With the possibility? Of course not. Anything that doesn’t involve a logical self-contradiction is possible. My disagreement is with the idea that it is sane or rational to base decisions on fantasies about being kidnapped and tortured in the absence of any evidence that this is at all likely to occur.
Evidence:
People are greedy. When people have the opportunity to exploit others, they often take it.
If anyone gets hold of your em, they can torture you for subjective aeons. Anyone who has a copy of your em can blackmail you: “Give me 99% of your property. For every minute you delay, I will torture your ems for a million subjective years.”
And what if someone actually wants to hurt you, instead of just exploit you? You and your romantic partner get in a fight. In a fit of passion, she leaves with a copy of your em. By the time the police find her the next day, you’ve been tortured for a subjective period longer than the age of the universe.
Very few, perhaps no one, will have the engineering skill to upload a copy of themselves without someone else’s assistance. When you’re dead and Apple is uploading your iEm, you’re trusting Apple not to abuse you. Is anyone worthy of that trust? And even if you’re uploaded safely, how will you store backup copies? And how will you protect yourself against hackers?
Sound more plausible now?
If you postulate ems that can run a million subjective years a minute (which is not at all scientifically plausible), the mainline copies can do that as well, which means talking about wall clock time at all is misleading; the new subjective timescale is the appropriate one to use across the board.
As for the rest, people are just as greedy today as they will be in the future. Organized criminals could torture you until you agree to sign over your property to them. Your girlfriend could pour petrol over you and set you on fire while you’re asleep. If you sign up for a delivery or service with Apple and give them your home address, you’re trusting them not to send thugs around to your house and kidnap you. Ever fly on an airliner? Very few, perhaps no one, will have the engineering skill to fly without someone else’s assistance. When you’re on the plane, you’re trusting the airline not to deliver you to a torture camp. Is anyone worthy of that trust? And even if you get home safely, how will you stay safe while you’re asleep? And how will you protect yourself against criminals?
Does committing suicide today sound a more plausible idea now?
All of those scenarios are not only extremely inconvenient and not very profitable for the people involved, but also carry a high risk of getting caught. This means the probability of any of them taking place is marginal, because the incentives just aren’t there in almost any situation. On the other hand, a digital file is far easier to acquire, incarcerate, transport, and torture, and also easier to hide from any authorities. If someone gets their hands on a digital copy of you, torturing you for x period of time can be as easy as pressing a button. You might never kidnap an orchestra and force them to play for you, but millions of people download MP3s illegally.
I would still rather be uploaded than die, but I don’t think you’re giving the opposing point of view anything like the credit it deserves.
If Y amount of computational resources can be used to simulate a million person-years, then the opportunity cost of using Y to torture someone is very large.
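Back-of-envelope version of that opportunity cost, with purely hypothetical numbers:

```python
# All figures hypothetical: if a block of compute Y can simulate a million
# productive person-years, burning Y on torture forgoes that output.
value_per_person_year = 50_000    # assumed market value of one em person-year, $
person_years_per_y = 1_000_000    # assumed simulation capacity of the block Y

opportunity_cost = value_per_person_year * person_years_per_y
print(f"${opportunity_cost:,} forgone")  # $50,000,000,000 forgone
```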
An upload, at least of the early generations, is going to require a supercomputer the size of a rather large building to run, to point out just one of the reasons why the analogy with playing a pirated MP3 is entirely spurious.
Now you’re just getting snarky.
This document is a bit old, but:

“...the laws of physics as now understood would allow one gram (more or less) to store and run the entire human race at a million subjective years per second.”
No one can hurt me today the way I could be hurt in a post-em world. In a world where human capacity for malevolence is higher, more precaution is required. One should not rule out suicide as a precaution against being tortured for subjective billions of years.
I’ve been snarky for this entire conversation—I find advocacy of death extremely irritating—but I am not merely being snarky. The laws of physics as now understood allow no such thing, and even the author of the document to which you refer—a master of wishful thinking—now regards it as obsolete and wrong. And the point still holds—you cannot benefit today the way you could in a post-em world. If you’re prepared to throw away billions of years of life as a precaution against the possibility of billions of years of torture, you should be prepared to throw away decades of life as a precaution against the possibility of decades of torture. If you aren’t prepared to do the latter, you should reconsider the former.
I rather subscribe to how Greg Egan describes what the author is doing:

“I don’t think you’re a ‘bad guy’. I do think it’s a shame that you’re burying an important and interesting subject — the kind of goals and capabilities that it would be appropriate to encode in AI — under a mountain of hyperbole.”
Also, in the absence of any evidence that this is at all unlikely to occur. But notice the original poster does not dwell on the probability of this scenario, only on its mere possibility. It seems to me you’re disagreeing with some phantasm you imported into the conversation.
If you think the situation is that symmetrical, you should be indifferent on the question of whether to commit suicide today.
If it had been generated as part of an exhaustive listing of all possible scenarios, I would have refrained from comment. As it is, being raised in the context of a discussion on whether one should try for uploading in the unlikely event one lives that long, it’s obviously intended to be an argument for a negative answer, which means it constitutes:
http://lesswrong.com/lw/19m/privileging_the_hypothesis/
Advocacy of death.
Do you have some actual data for me to update on? Otherwise, we’re just bickering over unjustifiable priors. That’s why I’m withholding judgment.
It did come out as this later, but not “obviously” from the original comment.
My physical body can only be tortured for a few decades, tops. An em can be tortured for a billion years, along with a billion em copies of myself.
Not the correct counterargument. Your torturer merely needs to keep you alive, or possibly cryopreserved, until lengthening your natural lifespan becomes possible.
Which is not a plausible scenario in today’s world.
If em torture is viable in the future, and I don’t think I can defend myself, I will seriously consider suicide. But rwallace’s comment was about today’s world.
The comment holds regardless. In today’s world, you can only be tortured for a few decades, but by the same token you can only lose a few decades of lifespan by committing suicide. If in some future world you can be tortured for a billion years, then you will also be losing a billion years of happy healthy life by committing suicide. If you think the mere possibility of torture—with no evidence that it is at all likely—will be grounds for committing suicide in that future world, then you should think it equally good grounds for committing suicide today. If you agree with me that would be insanely irrational today, you should also agree it will be insanely irrational in that future world.
I am risk averse, and I suspect the entire human species is too. Suppose I have to choose between two bets:
A: 50% chance of living 100 happy years. 50% chance of living 100 torture years.
B: 50% chance of living 1,000,000 happy years, 50% chance of living 1,000,000 torture years.
I will pick the first because it has the better bad option. While additional happy years have diminishing additional utility, additional torture years have increasing disutility. I would rather take a 50% chance of being tortured for 10 years than a 10% chance of being tortured for 50 years.
When WBE (whole brain emulation) is invented, the stakes will be upped. The good possibilities get much better, and the bad possibilities get much worse. As a risk-averse person, I find this frightening.
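A sketch of that preference structure with illustrative curves of my own choosing (not the commenter’s): square-root utility for happy years gives diminishing returns, and squared disutility for torture years makes it grow faster than linearly.

```python
import math

def utility(happy_years=0.0, torture_years=0.0):
    # Concave in happiness (diminishing returns), convex in torture
    # (increasing marginal disutility). Purely illustrative functional forms.
    return math.sqrt(happy_years) - torture_years ** 2

def expected_utility(bet):
    """bet -- list of (probability, happy_years, torture_years) branches."""
    return sum(p * utility(h, t) for p, h, t in bet)

A = [(0.5, 100, 0), (0.5, 0, 100)]
B = [(0.5, 1_000_000, 0), (0.5, 0, 1_000_000)]
print(expected_utility(A))  # -4995.0: dominated by the torture branch
print(expected_utility(B))  # ~-5.0e11: astronomically worse, so A is preferred

# The 50%/10-year vs 10%/50-year comparison comes out the same way:
print(expected_utility([(0.5, 0, 10), (0.5, 0, 0)]))  # -50.0
print(expected_utility([(0.1, 0, 50), (0.9, 0, 0)]))  # -250.0 (worse)
```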
Would you prefer
C: 50% chance of living 1 happy minute. 50% chance of living 1 torture minute.
over both? If not, why not?
At those ratios, absolutely. I’m not sure how to explain why, since it just seems obvious that suicide would be preferable to a 50% chance of being tortured for a century. (I’m not sure at what ratio it would become a real dilemma.)