The point of my post is that the probabilities themselves depend on whether we consider the risk worth it. To say it another way, which flattens some of the phenomenology I’m trying to do but might get the point across, I’m saying it’s a coordination problem, and computing beliefs in a CDT way is failing to get the benefits of participating fully in the possibilities of the coordination problem.
edit: Like, if everyone thought it was worth it, then it would be executed well (maybe), so the probability would be much higher, so it would be worth it. A “self-fulfilling prophecy”, from a CDT perspective.