Moral Maze dynamics push corporations not just to pursue profit at the expense of all else, but also to be extremely myopic. As long as the death doesn’t happen before the end of the quarter, the big labs, being immoral mazes, have no reason to give a shit about x-risk. Of course, every individual member of a big lab has reason to care, but the organization as an egregore does not (and so there is strong selection pressure for these organizations to have people that have low P(doom) and/or don’t (think they) value the future lives of themselves and others).
there is strong selection pressure for these organizations to have people that have low P(doom) and/or don’t (think they) value the future lives of themselves and others
This is an important thing I didn’t realize. When I try to imagine the people who make decisions in organizations, my intuitive model puts them somewhere between “normal people” and “greedy psychopaths”, depending on my mood and on how bad the organization seems.
But in addition to this, there is a systematic shift towards “people who genuinely believe things that happen to be convenient for the organization’s mission”, a kind of cognitive bias at the group scale. Not average people with average beliefs. Not psychopaths who prioritize profit above everything. But people selected from the pool of average ones by having their genuine beliefs aligned with whatever happens to be profitable in the given organization.
I was already aware of similar things happening in “think tanks”, where producing beliefs is the entire point of the organization. Their collective beliefs are obviously biased, not primarily because the individuals are biased, but because the individuals were selected for having their genuine beliefs already extreme in a certain direction.
But I didn’t realize that the same is kinda true for every organization, because the implied belief is “this organization’s mission is good (or at least neutral, if I am merely doing it for money)”.
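A minimal sketch of this selection effect, with made-up numbers: each candidate gets a genuine “mission is good” belief score, and the probability of ending up hired (applying, passing interviews, staying) grows with that score. No one lies and no one changes their mind, yet the hired pool’s average belief shifts:

```python
import random
import statistics

random.seed(0)

# Hypothetical toy model: each candidate's genuine belief that
# "this organization's mission is good", roughly on a 0..1 scale.
population = [random.gauss(0.5, 0.15) for _ in range(100_000)]

# Selection: the more a candidate already agrees with the mission,
# the more likely they are to apply, get hired, and stay (linear weighting).
hired = [b for b in population if random.random() < max(0.0, min(1.0, b))]

print(f"population mean belief: {statistics.mean(population):.3f}")
print(f"hired mean belief:      {statistics.mean(hired):.3f}")
# The hired pool skews toward mission-friendly beliefs even though every
# individual belief is genuine -- a pure selection effect, no persuasion.
```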
Would this mean that the epistemically healthiest organizations are those whose employees don’t give a fuck about the mission and only do it for money?