I DON’T CARE about your hell reasoning. I AM ALREADY FIGHTING for my future, don’t you dare decide you know so much better that you won’t accept the risk that I might have some measure that suffers. If you want good things for yourself, update your moral theory to get it out of my face. Again: if you try to kill me, I will try to kill you back, with as much extra pain as I think is necessary to make you-now fear the outcome.
Maybe some people would rather kill themselves than risk this outcome. That’s up to them. But don’t you force it on me, goddammit.
I do care about his reasoning, and disagree with it (most notably the “any torture → infinite torture” part, with no counterbalancing “any pleasure → ?” term in the calculation).
But I’m with lahwran on the conclusion: destroying the last copy of someone is especially heinous, and nowhere near justified by your reasoning. I’ll join his precommitment to punish you if you commit crimes in pursuit of these wrong beliefs (note: plain old retroactive punishment, nothing acausal here).
Under paragraph 2, destroying the last copy is especially heinous. That implies that you view replacing the death penalty in US states with ‘death followed by uploading into an indefinite long-term simulation of confinement’ as less heinous? The status quo is to destroy the only copy of the mind in question.
Would it be justifiable to simulate prisoners who are expected to die before completing their sentences, so that they can serve out their entire punitive terms and rejoin society as ems?
That implies that you view replacing the death penalty in US states with ‘death followed by uploading into an indefinite long-term simulation of confinement’ to be less heinous?
Clearly it’s less harsh, and most convicts would prefer incarceration for an indefinite time over a simple, final death. This might change after a few hundred or a few million subjective years, but I don’t know—it probably depends on what activities the em has access to.
Whether it’s “heinous” is harder to say. Incarceration is a long way from torture, and I don’t know what the equilibrium effect on other criminals will be if it’s known that a formerly-capital offense now enables a massively extended lifespan, albeit in jail.
The suicide rate for incarcerated Americans is three times that of the general population; anecdotally, many death row inmates have expressed the desire to ‘hurry up with it’. Werner Herzog’s interviews with George Rivas and his co-conspirators are good examples of the sentiment. There’s still debate about the effectiveness of the death penalty as a deterrent to crime.
I suspect that some of these people would prefer the uncertain possibility of confinement to hell by the divine over the certain continuation of their sentences at the hands of the state.
Furthermore, an altruist working to further the cause of secure deletion may be preventing literal centuries of human misery. Why is this any less important than feeding the hungry, who will suffer for at most a fraction of a single lifetime?
You’re still looking only at the negative side of the equation. My goals are not solely to reduce suffering, but also to increase joy. Incarceration is not joy-free, and not (I think) even net negative for most inmates. Likewise with your fears of an em future. It’s not joy-free, and while it may actually be negative for some ems, the probability space for ems in general is positive.
I therefore support suicide and secure erasure for any individual who reasonably believes themselves to be a significant outlier in terms of negative potential future outcomes, but strongly oppose the imposition of it on those who haven’t so chosen.
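The “probability space for ems in general is positive” claim above can be read as an expected-value statement. In my own notation (not the commenters’), it amounts to something like:

```latex
% u(e): utility of em-outcome e; p(e): its probability (my formalization, an assumption).
% The claim: the expectation over all outcomes is positive,
% even though some individual outcomes are negative.
\mathbb{E}[u] \;=\; \sum_{e \in \text{outcomes}} p(e)\, u(e) \;>\; 0,
\qquad \text{even though } u(e) < 0 \text{ for some } e.
```

The disagreement in this thread is then over whether individuals who expect to land in the negative-\(u(e)\) region may opt out via secure deletion, and whether anyone may impose that choice on others.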
An effective altruist could probably very efficiently go about increasing the joy in the probability space for all humans by offering wireheading to a random human as resources permit, but it doesn’t do much for people who are proximately experiencing suffering for other reasons. I instinctively think that this wireheading example is an incorrect application of effective altruism, but I do think it is analogous to the ‘overall space is good’ argument.
Do you support assisted suicide for individuals incarcerated in hell simulations, or with a high probability of being placed into one subsequent to upload? For example, if a government develops a practice of execution followed by torment-simulation, would you support delivering the gift of secure deletion to the condemned?
Thank you for the challenging responses!
I think I am addressing most of your position in this post here, in response to HungryHobo: http://lesswrong.com/lw/os7/unethical_human_behavior_incentivised_by/dqfi The ‘overall probability space’ point was also raised by RobinHanson, and I addressed that in a comment as well: http://lesswrong.com/lw/os7/unethical_human_behavior_incentivised_by/dq6x
Thank you for the thoughtful responses!
(I’m confused about who “his” refers to in the first paragraph—I predict 90% redman and 9% me)
edit: figured it out on third reread. the first paragraph responds to me, the second paragraph responds to redman.