I wouldn’t press the button, though I had to think a bit longer about the “erase from memory” part.
It reminds me of what Eliezer often says about Friendly AI: “If you offered Gandhi a pill that would make Gandhi a murderer, Gandhi would refuse to take it.”
I would also refuse even if my memory could be erased. Somehow, I don’t feel the memory erasure is really relevant, because when I’m considering whether to do it or not, I’m not even thinking about any guilt I might feel; I’m mostly repulsed by torture in general and by imagining myself in the place of the person to be tortured.
I don’t think I would have any particular problem with murder for an adequate reason, but I wouldn’t take a “murder pill”. A stupid illustration, though I don’t remember seeing this phrase before, and I’ve been following OB since the first post.
Example: X wouldn’t Y.
Rejoinder: Z, which is unlike X in relevant ways, would also not Y.
...huh?
More like: Z, which you could expect to be less bothered by Y than X is, would also not Y.
A quick Google search reveals the Gandhi phrase on Eliezer’s website:
http://yudkowsky.net/singularity
But I think I saw it in at least one of his papers too.