This comment feels like wishful thinking to me. Like, I think our communities are broadly some of the more truth-seeking communities out there. And yet, they have flaws common to human communities, such as both 1 and 2. Even so, I want to engage with these communities and to cooperate with them. That cooperation is made much harder if actors blithely ignore these dynamics by:
- Publishing criticism that could wait
- Pretending that they can continue working on that strategy doc they were working on while there’s an important discussion centered on their organization’s moral character happening in public
I have long experience watching conversations about orgs evolve. I advise my colleagues to reply urgently. I don’t think this is an attempt to manipulate anyone.
What are your ideas for attenuating the anti-epistemic effects of belief lock-in, groupthink, and information cascades?