Furthermore, an altruist working to further the cause of secure deletion may be preventing literal centuries of human misery. Why is this any less important than feeding the hungry, who at most will suffer for a proportion of a single lifetime?
You’re still looking only at the negative side of the equation. My goals are not solely to reduce suffering, but also to increase joy. Incarceration is not joy-free, and not (I think) even net negative for most inmates. Likewise your fears of an em future. It’s not joy-free, and while it may actually be negative for some ems, the probability space for ems in general is positive.
I therefore support suicide and secure erasure for any individual who reasonably believes themselves to be a significant outlier in terms of negative potential future outcomes, but strongly oppose the imposition of it on those who haven’t so chosen.
An effective altruist could probably very efficiently increase the joy in the probability space for all humans by offering wireheading to random humans as resources permit, but that does little for people who are proximately experiencing suffering for other reasons. I instinctively think this wireheading example is an incorrect application of effective altruism, but I do think it is analogous to the ‘overall space is good’ argument.
Do you support assisted suicide for individuals incarcerated in hell simulations, or with a high probability of being placed into one subsequent to upload? For example, if a government develops a practice of execution followed by torment-simulation, would you support delivering the gift of secure deletion to the condemned?
I think I address most of your position in this reply to HungryHobo: http://lesswrong.com/lw/os7/unethical_human_behavior_incentivised_by/dqfi The ‘overall probability space’ point was also raised by RobinHanson, and I addressed it in a comment here: http://lesswrong.com/lw/os7/unethical_human_behavior_incentivised_by/dq6x
Thank you for the thoughtful responses!