This actually reminds me of an argument I had with some Negative-Leaning Utilitarians on the old Felicifia forums. Basically, a common concern for them was that r-selected species appear, generally speaking, to suffer far more than they are happy, and that this can imply we should try to reduce the suffering by eliminating those species, or at least by avoiding the expansion of life to other planets generally.
I likened this line of reasoning to the idea that we should Nuke The Rainforest.
Personally I think a similar counterargument applies here as well. Translated into your thought experiment, it would be, in essence, that while it is true that some percentage of minds will probably end up being tortured by sadists, this is likely to be outweighed by the sheer number of minds that are even more likely to be uploaded into some kind of utopian paradise. Given that truly psychopathic sadism is quite rare in the general population, one would expect hell simulations to be similarly rare relative to utopian ones. In the long run, the optimistic view is that decency will prevail and that net happiness will be positive, so we should not go around trying to blender brains.
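To make the weighing explicit, here is a toy back-of-the-envelope sketch in Python. Every number in it is a made-up assumption for illustration, not anything from the discussion itself:

```python
# Toy expected-value version of the "ratio of simulations" argument.
# All figures below are illustrative assumptions, not empirical estimates.

p_sadist = 0.01      # assumed fraction of simulation-runners who are sadists
p_benevolent = 1.0 - p_sadist
u_heaven = 1.0       # assumed utility per mind in a utopian simulation
u_hell = -10.0       # assumed disutility per mind in a hell simulation

expected = p_benevolent * u_heaven + p_sadist * u_hell
print(f"Expected utility per uploaded mind: {expected:+.2f}")  # +0.89 here

# The sign of this expectation is what the disagreement turns on: a
# negative-leaning utilitarian who weights u_hell at, say, -1000 gets
# -9.01 instead, and the blender starts to look appealing.
```

The point of the sketch is just that the optimistic conclusion follows from the rarity of sadism only if hells are not weighted overwhelmingly more heavily than heavens.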
As for the general issue of terrible human decisions being incentivized by these ideas… humans are capable of using all sorts of rationalizations to justify terrible decisions, so the mere possibility that some people will skip due diligence and instead abuse an idea to justify their evil should not, by itself, be reason to abandon the idea.
For instance, the possibility of living an indefinite lifespan is likely to dramatically alter people’s behaviour, including making them more risk-averse and more long-term in their thinking. This is not necessarily a bad thing, but it could lead to fewer people making necessary sacrifices for the good. These effects are also notoriously difficult to predict. Ask a medieval peasant what the effects of machines that could farm vast swaths of land would be on the economy and their livelihood, and you’d probably get a very parochial answer.
Thank you for the thoughtful response! I’m not convinced that your assertion successfully breaks the link between effective altruism and the blender.
Is your argument consistent with making the following statement when discussing the impending age of em?
“If your mind is uploaded, a future version of you will likely subjectively experience hell. Some other version of you may also subjectively experience heaven. Many people, copies of you split off at various points, will carry all the memories of your human life. If you feel like your brain is in a blender trying to conceive of this, you may want to put it into an actual blender before someone with temporal power and an uploading machine decides to define your eternity for you.”
Well, if we’re implying that time travellers could go back and invisibly copy you at any point in time and then upload you to whatever simulation they feel inclined towards… I don’t see how blendering yourself now will prevent them from just going to the moment before that and copying that version of you.
So the reality is that blendering yourself achieves only one thing: preventing future possible yous from existing. Personally I think that does a disservice to future you, and the same can be extended to others. We cannot conceivably prevent super-advanced time travellers from copying and uploading anyone’s mind. Ultimately that is outside of our locus of control and therefore not worth worrying about.
What is more pressing, I think, are the questions of how we are practically acting to improve the positive conscious experiences of existing and potentially existing sentient beings, encouraging the general direction towards heaven-like simulation, and discouraging sadistic hell-like simulation. These may not be preventable, but our actions in the present should have an outsized impact on the trillions of descendants of humanity that will likely be our legacy to the stars. Whatever we can do now to encourage altruism and discourage sadism in humanity may very well determine the ratios of heaven to hell simulations that those aforementioned time travellers may one day decide to throw together.
Time traveling super-jerks are not in my threat model. They would certainly be terrible, but as you point out, there is no obvious solution, and fortunately time travel does not look to be nearly as close technologically as uploading does. The definition of temporal I am using is as follows:
“relating to worldly as opposed to spiritual affairs; secular.” I believe the word is appropriate in context, as traditionally, eternity is a spiritual matter and does not require actual concrete planning. I assert that if uploading becomes available within a generation, the odds of some human or organization doing something utterly terrible to the uploaded are high, not low. There are plenty of recent examples of bad behavior by institutions that are around today and likely to persist.