Unknown, that’s a very interesting take indeed, and a good argument for Eliezer’s proposition, but it doesn’t say much about what to do if you can assume most of the 3^^^3 would ask for dust. Can you tell me what you would do purely in the context of my previous post?
If you set it to 2, two people will each be tortured by an amount that is less than what the first person suffered by 1/3^^^3 of the difference between the 50 years and a dust speck.
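In case the setup is unclear, here is a toy rendering of the lever in Python. Everything numeric is a stand-in of my own: N_POSITIONS fills in for 3^^^3 (which nothing can actually represent), the per-step decrement divides the gap by the number of steps rather than by 3^^^3 itself so the far end lands exactly on a speck, and the “dust speck” value is deliberately inflated so that head-count times speck still swamps the 50 years, which is the feature of the original puzzle that matters.

```python
# Toy model of the lever described above. N_POSITIONS stands in for 3^^^3,
# and DUST_SPECK is inflated so that (head-count * speck) still exceeds the
# 50 years, preserving the shape of the original dilemma. Units are
# "years of suffering" throughout.

N_POSITIONS = 1_000_000   # stand-in for 3^^^3
TORTURE = 50.0            # per-person harm at position 1
DUST_SPECK = 1e-4         # assumed per-person harm at the last position

def harm_per_person(position: int) -> float:
    """Position 1 is 50 years for one person; each step to the right adds a
    victim and shaves one (N_POSITIONS - 1)-th of the torture-to-speck gap
    off what each of them suffers, reaching a bare speck at the far end."""
    step = (TORTURE - DUST_SPECK) / (N_POSITIONS - 1)
    return TORTURE - (position - 1) * step

def people_affected(position: int) -> int:
    return position

print(f"{harm_per_person(1):.4f}, {people_affected(1)} person")
print(f"{harm_per_person(N_POSITIONS):.4f}, {people_affected(N_POSITIONS)} people")
# 50.0000, 1 person
# 0.0001, 1000000 people
```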
Of course not; this would be a no-brainer ratio for the lever to operate with. You should have said that position 2 on the lever tortures 2 people for something like 25.0001 years. That puts me in far more of a quandary. And intuition (gasp!) leads me to believe that while harm does, of course, aggregate over people, it aggregates slightly less than linearly. In this case, push the lever as far as it can go! Spread that harm as thinly as you can! [Braces self for backlash...]
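To put a crude number on that intuition: one way (mine, not anything anyone here has committed to) to model “aggregates slightly less than linearly over people” is to raise the head-count to a power alpha < 1. With the stand-in numbers from the sketch above, alpha = 0.9 is already enough to flip the comparison between the two ends of the lever; where the crossover sits depends entirely on the stand-ins, not on anything principled.

```python
# Crude model of sublinear aggregation over people: total disutility is
# (head-count ** alpha) * (per-person harm), with alpha = 1.0 being plain
# summation. Same stand-in lever as before; all numbers are illustrative.

N_POSITIONS = 1_000_000   # stand-in for 3^^^3
TORTURE = 50.0
DUST_SPECK = 1e-4

def harm_per_person(position: int) -> float:
    step = (TORTURE - DUST_SPECK) / (N_POSITIONS - 1)
    return TORTURE - (position - 1) * step

def total_disutility(position: int, alpha: float) -> float:
    # alpha = 1.0: sum over people; alpha < 1.0: each extra sufferer
    # counts a little less than the one before.
    return position ** alpha * harm_per_person(position)

for alpha in (1.0, 0.9):
    left = total_disutility(1, alpha)             # torture one person
    right = total_disutility(N_POSITIONS, alpha)  # speck everyone
    print(f"alpha={alpha}: leave it ({left:.1f}) vs. push it ({right:.1f})")
# alpha=1.0: leave it (50.0) vs. push it (100.0)  -> plain summation says leave it
# alpha=0.9: leave it (50.0) vs. push it (25.1)   -> sublinear says push it all the way
```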
Say we found the magical (subjective, natch) ratio of harm to people (which must exist). This ratio is plugged into the lever: per-person harm decreases with the number of people exactly along this line. If two people getting dusted once is equal to one person getting dusted twice, does this mean you don’t care where the lever is placed, since total harm = (per-person harm) × (people) = k everywhere?
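For concreteness, here is the rebuilt lever I am imagining, reading the ratio above as “total harm held constant”; the value k = 50 person-years is arbitrary and mine.

```python
# The "magic ratio" lever, read as: per-person harm is rescaled so that
# (per-person harm) * (people) comes out to the same constant k at every
# setting. Under plain summation you are then indifferent to where it sits.
from fractions import Fraction

K = Fraction(50)  # total harm held fixed across the lever, in person-years

def magic_harm_per_person(people: int) -> Fraction:
    """Spread the fixed total K evenly over `people` sufferers."""
    return K / people

def magic_total(people: int) -> Fraction:
    return people * magic_harm_per_person(people)

assert magic_total(1) == magic_total(2) == magic_total(1_000_000) == K
```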
Will you try to pull the lever over to 3^^^3 if there’s a significant chance the lever might get stuck somewhere in the middle?
I would make sure I had an oil can to hand. ;)