If Eliezer Yudkowsky came to your house and handed you a gun and said that he would need your help with killing some people, that there was a very good reason for doing so and he would explain on the way there, would you get in the van?
Bearing in mind, of course, that he’s convinced people to let an unFriendly AI out of the box, so once he gets you alone he’ll probably be able to convince you of just about anything.
Yeah, I’d probably get in the van. I’d be very confused by the whole thing, but given the situation, it seems likely that someone needs to get shot—it would take me a while to figure out whether that someone was assassin!Eliezer or the people he says need to die, but I’d rather be in a position where I could influence events than not be in one.
Refusing to kill is influencing events. I wouldn’t get in the van, do your crazy shit without me.
Hell yeah. Just let me grab my bulletproof vest! What is our escape plan?
I would reject the offer based upon the assumption that EY should be able to find or purchase more suitable assassins, and thus I was being tested or manipulated in some ridiculous fashion.
However, it would significantly raise my estimate that those people may need to die (by more than 10%).
Let’s say that you’ve got military training but are currently deeply in debt and unemployed, that you know EY knows about those factors, and that inside the door of the van you spot three other people who you recognize as having similar skills and similar predicaments.
...that is so absurd that I would accept it as strong evidence that this reality is a computer simulation being tweaked for interestingness. I’d get in the car, lest I disappear for being too boring.
Something like this, then.
Ping me after the Singularity, we’ll produce the SIAI Hit Squad video game.
More likely the situation would turn out much more mundane. And with more rooftop chases.
A crack commando unit sent to prison by a military court for a crime they didn’t commit?
“What would you do if you were a completely different person?”
The me-that-is-not-me would accept the offer, based upon the evidence that three others from a similar cluster in person-space also agreed and are recognized by the me-that-is-not-me, making it likely that they have worked together previously on such extrajudicial excursions, and that the me-that-is-not-me apparently has very poor decision-making capabilities, at least to the point of being unable to find decent employment, avoid debt, or avoid the military.
I do not consider such hypotheticals useful.
The point is: what if you were asked to do something obviously immoral, but something that could conceivably be justified, and that nobody else could do for you? Maybe some atrocity related to your job.
Me neither, honestly, but it’s popular enough around here I thought I’d give it a shot.
“Obviously immoral” and “conceivably justifiable” are mutually exclusive by my definitions. I would plug the act into my standard moral function, which apparently answers the question “is there a single point of moral failure” with “no,” at least for me.
What I mean is something which would, under normal circumstances, be bad, but which, given very specific conditions, would be the best way to prevent something even worse, and for which demonstrating those conditions would be difficult.
Yes, that’s what I understood it to mean, and I view it as a trolley problem with error bars, plus “leadership influence” in the form of the request coming from EY.
Who is “Eliezer Yudkowsky”?
/snark