Our disagreement here may be that I don’t think “group rationality” is a thing.
To be clear, my intended point is definitely that integrity and accountability are a core part of individual rationality, not just group rationality. In particular, getting good at designing the incentives that you are under strikes me as likely a necessary step to actually be able to have accurate beliefs about the world. In some sense this requires thinking about other people, but it is with the aim of making your own models more accurate.
Oh, that’s interesting, and not where I thought you were going. Knowing you mean it about your and my biases due to incentives, and that it’s about understanding and choosing situations whose incentive structures allow me to think rationally, rather than about other people’s incentives in general, helps a lot. I think I can fully support that framing.