One problematic aspect is that it’s often easier to avoid motivated reasoning when the stakes are low. Even if you manage to avoid it in 95% of cases, if the remaining 5% are where what really matters lies, you are still overall screwed.
Good point.
Alignment theory and AGI prediction spring to mind again; there it’s not just our self-concepts at stake, but the literal fate of the world.