If I want to signal how much I care, I’ll stick with puppies or local soup kitchens, thank you very much. That will get me a lot more warm fuzzies—and respect—from my neighbors and colleagues than making hay about a robot apocalypse.
Humans are adaptation-executers, not fitness maximizers, and they evolved in tribes of no more than 100 or so. They are also exquisitely sensitive to status. As such, they will happily work way too hard to increase their status ranking in a small group, whether or not that makes sense from the outside view. (This may or may not come after failing to increase their status ranking in more mainstream groups.)
If you want to maximize respect from a broad, nonspecific community (e.g. neighbors and colleagues), that’s a good strategy. If you want to maximize respect from a particular subculture, you could do better with a more specific strategy. For example, to impress your political allies, worry about upcoming elections. To impress members of your alumni organization, worry about the state of your sports team or the university president’s competence. To impress folks on LessWrong, worry about a robot apocalypse.
That’s a fully general argument: to impress [people who care about X], worry about [X]. But it doesn’t explain why, for rationalists, X equals a robot apocalypse as opposed to [something else].
My best guess is that it started because Eliezer worries about a robot apocalypse, and he’s got the highest status around here. By now, a bunch of other respected community members are also worried about FAI (Friendly AI), so it’s about affiliating with a whole high-status group rather than imitating a single leader.
I wouldn’t have listened to EY if he weren’t originally talking about AI. I realize others’ EY origin stories may differ (e.g. HPMOR).
Much depends on who you are trying to impress. Around here, lavishing care on cute puppies won’t earn you much status or respect at all.
That raises the question of why people care about getting status from LessWrong in the first place. There are many other, more prominent internet communities.
Other types of apocalyptic phyg also acquire followers without being especially prominent. Basically, the internet has a long tail, offering many special-interest groups space to exist.
Yeah, but how much respect will they get you from LessWrong?