Raise your hand if you (yes you, the person reading this) will submit to 50 years of torture in order to avert a “least bad” dust speck momentarily finding its way into the eyes of an unimaginably large number of people.
Why was it not written “I, Eliezer Yudkowsky, should choose to submit to 50 years of torture in place of a googolplex people getting dust specks in their eyes”?
Why restrict yourself to the comforting distance of omniscience?
Did Miyamoto Musashi ever exhort the reader to ask his sword what he should want? Why is this not a case of using a tool as an end in and of itself rather than as a means to achieve a desired end?
Are you irrational if your “something to protect” is yourself... from torture?
Has anyone ever addressed whether this applies to the AGI Utility Monster, whose experiential capacity would presumably exceed that of the ~7 billion humans who should rationally subserve Its interests (whatever they may be)?
I suffer under no delusion that I’m a morally perfect individual.
You seem to believe that to identify the morally correct path, one must also be willing to follow it. Morality pushes our wills in that direction, but selfishness has its own role to play, and here it pushes elsewhere.
But yes, I am willing to say that I should submit to 50 years of torture in order to save 3^^^3 people from getting dust specks in their eyes. I’ll also openly admit that I am not willing to submit to such. This is not contradictory: “should” is a moral judgment, but being willing to be moral at such high cost is another thing entirely.
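To make the aggregation arithmetic behind that “should” concrete, here is a minimal sketch under a simple additive-utility assumption; the disutility numbers are placeholders I made up, not anything from the original post:

```python
# Naive additive-utility comparison. The harm-unit values below are
# made-up placeholders, purely for illustration.

TORTURE_DISUTILITY = 1e9   # assumed harm units for 50 years of torture
SPECK_DISUTILITY = 1e-6    # assumed harm units for one momentary dust speck

# How many speck victims before linear aggregation says the specks
# outweigh the torture?
break_even = TORTURE_DISUTILITY / SPECK_DISUTILITY
print(f"break-even population: {break_even:.0e}")  # 1e+15

# Even a googol (10**100) of victims exceeds that threshold by dozens of
# orders of magnitude, and 3^^^3 is incomparably larger still.
print(10**100 > break_even)  # True
```

Under any finite choice of those two placeholder numbers, the break-even population is finite, and 3^^^3 swamps it.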
I would not submit to 50 years of torture to avert dust specks in the eyes of lots of people.
I suspect I also would not submit to 50 years of torture to avert a stranger being subjected to 55 years of torture.
It’s not clear to me what, if anything, I should infer from this.
That you value yourself more than a stranger. (I don’t think there’s anything wrong with that, BTW, so long as this doesn’t mean you’d defect in a Prisoner’s Dilemma against them.)
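For readers unfamiliar with the reference, here is a bare-bones sketch of the standard Prisoner’s Dilemma payoffs (textbook values, nothing specified in this thread) and why a purely self-interested player defects:

```python
# payoffs[(my_move, their_move)] = (my_payoff, their_payoff)
# Standard textbook Prisoner's Dilemma values; "C" = cooperate, "D" = defect.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

# Whatever the other player does, defecting gives the self-interested
# player a strictly higher payoff -- that is the "defect" being warned against.
for their_move in ("C", "D"):
    assert payoffs[("D", their_move)][0] > payoffs[("C", their_move)][0]
print("defection dominates")
```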
Sure. Sorry, what I meant was it’s not clear what I should infer from this about the relative harmfulness of 50 years of torture, 55 years of torture, and Dust Specks.
Mostly, what it seems to imply is that “would I choose A over B?” doesn’t necessarily have much to do with the harmfulness of A and B to the system as a whole.
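A toy numerical version of that point, using the 50-vs-55 example above (the harm units are made up):

```python
# Option A: I'm tortured for 50 years, the stranger isn't touched.
# Option B: I'm untouched, the stranger is tortured for 55 years.
harm_to_me       = {"A": 50, "B": 0}
harm_to_stranger = {"A": 0,  "B": 55}

# What I'd actually pick, caring only about myself:
my_choice = min(harm_to_me, key=harm_to_me.get)
# What minimizes harm to the system as a whole:
least_total = min(["A", "B"], key=lambda o: harm_to_me[o] + harm_to_stranger[o])

print(my_choice, least_total)  # B A -- the two questions come apart
```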
Ready the tar and feathers, but I wouldn’t submit myself to even 1 year of torture to avert a stranger being tortured for 50 years if no terrible social repercussions could be expected.
Yup. I suspect that’s true of the overwhelming majority of people. It’s most likely true of me.
Why was it not written “I, Eliezer Yudkowsky, should choose to submit to 50 years of torture in place of a googolplex people getting dust specks in their eyes”?
Because then it would clearly not be the same argument anymore, and it would appeal only to people who subscribe to an even narrower form of incredibly altruistic utilitarianism, people I suspect barely exist, statistically speaking. If the person chosen for torture were random, it would make a bit more sense, but it would be essentially the same argument, given the ridiculously high numbers involved.