I use the term “doomer” to refer to people with a cluster of beliefs that I associate with LW users/MIRI people, including:
high P(doom)
fast takeoffs (i.e., AI will have little impact on the world before it is powerful enough to be very dangerous)
transformative models will be highly rational agents
I don’t think I can use “AI pessimist” as an alternative, because that only really describes the first of those beliefs, and I often want to refer to the cluster as a whole.
Maybe I should say “MIRI-style pessimist”?
Yeah, I like “MIRI-style pessimist”. It’s not free of ambiguity, but it’s much more precise and doesn’t have negative valence baked in.
I think “MIRI-style pessimist” is actually pretty good. There are in fact other styles of pessimist, and it’s good not to conflate their views.