I like the concept, although I wonder if you run into some kind of contradiction if you follow it all the way to the extremes. Is the belief that “all beliefs should be updateable” itself updateable?
I’m not that worried about the self-reference problem, since a competing framework for linking evidence and beliefs would have to outperform it, which seems pragmatically unlikely. Unlinking evidence and beliefs can work locally, which is the theme of some decision-theory ‘paradoxes’, but I see such a move as lobotomizing yourself. Sure, you can construct an adversarial universe in which Omega or whoever demands I lobotomize myself in order to win/not-lose, but at that point we’re in blackmail territory.
My definition of insanity includes non-updateable beliefs.
Is there a term for a non-updateable belief?
“Stuck priors” is the term I’ve heard.
Fair, and I’d forgotten about the term “stuck prior” (I think I’ve heard “trapped prior” before). Thanks!
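For concreteness, here’s a minimal sketch of why a prior of exactly 0 or 1 is “stuck” in the plain Bayesian sense: no finite evidence can move it. The function and numbers below are just my illustration, not anything proposed in the discussion above.

```python
def bayes_update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Posterior P(H | E) from prior P(H) and likelihoods P(E | H), P(E | not-H)."""
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1.0 - prior)
    return numerator / denominator if denominator > 0 else prior

# A 60% prior moves on strong evidence (P(E|H)=0.9, P(E|~H)=0.1):
print(bayes_update(0.60, 0.9, 0.1))  # ~0.93

# A prior of exactly 1.0 (or 0.0) never moves, however lopsided the evidence:
print(bayes_update(1.0, 0.1, 0.9))   # 1.0 -- a "stuck" / "trapped" prior
print(bayes_update(0.0, 0.9, 0.1))   # 0.0
```

(The posterior is a weighted mix of the prior over both hypotheses, so once one hypothesis has zero weight it contributes nothing, and the update leaves it at zero forever.)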