That’s a fully general argument: to impress [people who care about X], worry about [X]. But it doesn’t explain why, for rationalists, X is a robot apocalypse rather than [something else].
My best guess is that it started because Eliezer worries about a robot apocalypse, and he’s got the highest status around here. By now, a bunch of other respected community members are also worried about FAI, so it’s about affiliating with a whole high-status group rather than imitating a single leader.
I wouldn’t have listened to EY in the first place if he hadn’t originally been talking about AI. I realize others’ EY origin stories may differ (e.g. coming in via HPMOR).