I believe that, similar to conservation of expected evidence, there’s a rule of rationality saying that you shouldn’t expect your beliefs to oscillate back and forth too much: if you do expect that, it means there’s a lot of uncertainty about the factual matters, and that uncertainty should push your credence closer to max entropy. Can’t remember the specific formula, though.
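For reference, the conservation-of-expected-evidence part can be written out explicitly. This is just a quick sketch of that standard identity, with H standing for the hypothesis and E for the anticipated evidence (the symbols are mine, not from the rule I can’t remember):

P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E), \qquad \text{equivalently } \mathbb{E}\big[P(H \mid E)\big] = P(H).

In words: your current credence already equals your expectation of your future credence, so any anticipated update in one direction must be balanced by a possible update in the other.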
Good point. I was actually thinking about that and forgot to mention it.
I’m not sure how to articulate this well, but my diagram and OP were mainly targeted at gears-level models. Using the atheism example, the world’s smartest theist might have a gears-level model that is further along than mine. However, I expect that the world’s smartest atheist has a gears-level model that is further along than the world’s smartest theist.