If you want to maximize respect from a broad, nonspecific community (e.g. neighbors and colleagues), that’s a good strategy. If you want to maximize respect from a particular subculture, you could do better with a more specific strategy. For example, to impress your political allies, worry about upcoming elections. To impress members of your alumni organization, worry about the state of your sports team or the university president’s competence. To impress folks on LessWrong, worry about a robot apocalypse.
That’s a fully general argument: to impress [people who care about X], worry about [X]. But it doesn’t explain why for rationalists X equals a robot apocalypse as opposed to [something else].
My best guess is that it started because Eliezer worries about a robot apocalypse, and he’s got the highest status around here. By now, a bunch of other respected community members are also worried about FAI, so it’s about affiliating with a whole high-status group rather than imitating a single leader.
I wouldn’t have listened to EY if he hadn’t originally been talking about AI. I realize others’ EY origin stories may differ (e.g. HPMOR).