I’m not claiming that you should believe this; I’m merely providing you with the true information that I believe it.
Something feels off to me about this notion of “a belief about the object level that other people aren’t expected to share” from an Aumann’s Agreement Theorem point of view: the beliefs of other rational agents are, in fact, enormous updates about the world! Of course, Aumannian conversations happen exceedingly rarely outside of environments with tight, verifiable feedback loops about correctness, so in the real world something like these norms may well be needed. But the part where “agreeing to disagree” gets flagged as exactly what ideal agents would not be doing seems important to me.
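For concreteness, here’s a minimal sketch of the flavor of that result (my own toy example, not anything from the thread; the coin model, the numbers, and the function names are all invented for illustration). Two agents share a prior and a model of the world, each privately observes some coin flips, and a single honest exchange of posteriors forces them onto the same answer, because in this simple setup each announced posterior reveals the full evidence behind it. The general theorem is stronger and runs through information partitions and common knowledge rather than this direct evidence-pooling shortcut:

```python
import random

# Toy model (illustrative numbers, my own assumption): a coin's bias is
# either 0.3 or 0.7, with a shared prior of 0.5 on each hypothesis.
BIASES = (0.3, 0.7)
PRIOR_HI = 0.5  # shared prior probability that the bias is 0.7

def posterior(heads, tails):
    """P(bias = 0.7 | private flips), under the shared model."""
    like_hi = BIASES[1] ** heads * (1 - BIASES[1]) ** tails
    like_lo = BIASES[0] ** heads * (1 - BIASES[0]) ** tails
    return PRIOR_HI * like_hi / (PRIOR_HI * like_hi + (1 - PRIOR_HI) * like_lo)

def likelihood_ratio(announced):
    """Invert an announced posterior back into the evidence behind it."""
    posterior_odds = announced / (1 - announced)
    prior_odds = PRIOR_HI / (1 - PRIOR_HI)
    return posterior_odds / prior_odds

def update_on_announcement(mine, theirs):
    """My posterior after hearing the other agent's announced posterior.

    This works because our private flips are independent given the bias,
    so the two likelihood ratios simply multiply.
    """
    odds = (PRIOR_HI / (1 - PRIOR_HI)) * likelihood_ratio(mine) * likelihood_ratio(theirs)
    return odds / (1 + odds)

random.seed(0)
true_bias = random.choice(BIASES)
flips = lambda n: sum(random.random() < true_bias for _ in range(n))

heads_a, n_a = flips(10), 10   # agent A sees 10 private flips
heads_b, n_b = flips(40), 40   # agent B sees 40 private flips
p_a = posterior(heads_a, n_a - heads_a)
p_b = posterior(heads_b, n_b - heads_b)
print(f"before talking: A believes {p_a:.3f}, B believes {p_b:.3f}")

# One honest exchange of posteriors suffices in this toy setup: each
# announcement reveals the announcer's full evidence, so both agents
# land on the same number. No agreeing to disagree.
p_a, p_b = update_on_announcement(p_a, p_b), update_on_announcement(p_b, p_a)
print(f"after exchange:  A believes {p_a:.3f}, B believes {p_b:.3f}")
assert abs(p_a - p_b) < 1e-12
```

The sketch also suggests why real conversations fall short of this ideal: the one-round convergence only works because both sides share the exact same model and report honest posteriors, which is roughly the tight, verifiable feedback-loop condition that most real disagreements lack.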