But to use rthomas2's idea of expectation, these can just be contradictory expectations. If we take away the constraint that beliefs be functions from world states to probabilities and instead let them be relations between world states and probabilities, we eliminate the need to talk about alief/belief or belief and belief-in-belief and can just talk about having contradictory axia/priors/expectations.
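To make the function/relation distinction concrete, here is a minimal sketch in Python (my own illustration, not rthomas2's; names like `belief_fn`, `belief_rel`, and `contradictory` are hypothetical). A function assigns each world state exactly one probability; a relation may pair the same state with several, which is exactly what lets it represent contradictory expectations.

```python
from collections import defaultdict

# Belief-as-function: each world state maps to exactly one probability.
belief_fn = {
    "it will rain today": 0.2,
}

# Belief-as-relation: a world state may be paired with several probabilities,
# e.g. a reflective estimate and a gut-level expectation held at the same time.
belief_rel = defaultdict(set)
belief_rel["it will rain today"].update({0.2, 0.8})

def contradictory(rel):
    """Return the states paired with more than one probability,
    i.e. the contradictory expectations the relation allows."""
    return {state: probs for state, probs in rel.items() if len(probs) > 1}

print(contradictory(belief_rel))
# {'it will rain today': {0.2, 0.8}}
```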
I think making an alief/belief distinction is mostly interesting if you want to think in terms of belief in the technical sense used by Jaynes, Pearl, et al. Unfortunately, humans don't actually have beliefs in this technical sense, hence the entire rationalist project.