Thank you for your kind words. I don’t know if I’m nitpicking here, but I don’t see tribalism as a single point of moral failure, I see it as leading to group structures with a single point of moral failure.
As for the modern name, while in the country I grew up in, Greece, religion and nationality are practically the same (the religion is tellingly named Greek Orthodox Christianity), I use tribalism for any situation where people may put their allegiance in a group. Nations are perhaps the nearest modern equivalent to the tribe concept, but cities, neighborhoods, clans/families, religions, football clubs, political parties, and nearly anything else you can organise people around can take over the psychological function of a tribe.
Like blogs. I wonder if people feel that LessWrong forms a rationalist tribe, and if so, whether it contains a single point of moral failure.
If Eliezer Yudowsky came to your house, handed you a gun, and said that he needed your help killing some people, that there was a very good reason for doing so, and that he would explain on the way there, would you get in the van?
Bearing in mind, of course, that he’s convinced people to let an unFriendly AI out of the box, so once he gets you alone he’ll probably be able to convince you of just about anything.
Yeah, I’d probably get in the van. I’d be very confused by the whole thing, but given the situation, it seems likely that someone needs to get shot—it would take me a while to figure out whether that someone was assassin!Eliezer or the people he says need to die, but I’d rather be in a position where I could influence events than not be in one.
Refusing to kill is influencing events. I wouldn’t get in the van; do your crazy shit without me.
Hell yeah. Just let me grab my bulletproof vest! What is our escape plan?
I would reject the offer, based upon the assumption that EY should be able to find or purchase more suitable assassins, and thus that I was being tested or manipulated in some ridiculous fashion.
However, it would significantly raise my estimate that those people may need to die (by more than 10%).
Let’s say that you’ve got military training but are currently deeply in debt and unemployed, that you know EY knows about those factors, and that inside the door of the van you spot three other people who you recognize as having similar skills and similar predicaments.
...that is so absurd that I would accept it as strong evidence that this reality is a computer simulation being tweaked for interestingness. I’d get in the car, lest I disappear for being too boring.
Something like this, then.
Ping me after the Singularity, we’ll produce the SIAI Hit Squad video game.
More likely the situation would turn out much more mundane. And with more rooftop chases.
A crack commando unit sent to prison by a military court for a crime they didn’t commit?
“What would you do if you were a completely different person?”
The me-that-is-not-me would accept the offer, based upon the evidence that three others from a similar cluster in person-space also agreed and are recognized by the me-that-is-not-me, making it likely that they have worked together previously on such extra-judicial excursions, and that the me-that-is-not-me apparently has very poor decision-making capabilities, at least to the point of being unable to find decent employment, avoid debt, or avoid the military.
I do not consider such hypotheticals useful.
The point is: what if you were asked to do something obviously immoral, but that could conceivably be justified, and that nobody else could do for you? Maybe some atrocity related to your job.
Me neither, honestly, but it’s popular enough around here that I thought I’d give it a shot.
“Obviously immoral” and “conceivably justifiable” are mutually exclusive by my definitions. I would plug the act into my standard moral function, which apparently answers the question “is there a single point of moral failure” with “no,” at least for me.
What I mean is something which would under normal circumstances be bad, but which, given very specific conditions, would be the best way to prevent something even worse, and for which demonstrating those conditions would be difficult.
Yes, that’s what I understood it to mean, and I view it as a trolley problem with error bars and “leadership influence” in the form of the request coming from EY.
Who is “Eliezer Yudowsky?”
/snark
Perhaps the real single point of failure is identifying as anything. I might identify as a rationalist when speaking to others, but in my own mind I think such an identification would be unhelpful (and could be harmful).
Deeper, more controversial answer: The single point of failure is words, or rather the fact that social convenience strongly impels us to make a medium designed for communication double as a medium for our own thoughts. The phenomenon of self-identifying as an “X-ist” (even if initially only for social convenience) is just one of the unfortunate consequences.
Related: Beware identity.