Hmm, maybe I’m missing something basic and should just go re-read the original posts, but I’m confused by this statement:
So what we do here is say “belief set A is strictly ‘better’ if this particular observer always trusts belief set A over belief set B”, and “trust” is defined as “whatever we think belief set A believes is also what we believe”.
In this, belief set A and belief set B are analogous to A[C] and C (or some c in C), right? If so, then what’s the analogue of “trust… over”?
If we replace our beliefs with A[C]’s, then how is that us trusting it “over” c or C? It seems like it’s us trusting it, full stop (without reference to any other thing that we are trusting it more than). No?
In this, belief set A and belief set B are analogous to A[C] and C (or some c in C), right?
Yes.
If we replace our beliefs with A[C]’s, then how is that us trusting it “over” c or C? It seems like it’s us trusting it, full stop
So I only showed the case where info contains information about A[C]’s predictions, but info is allowed to contain information from A[C] and C (though not from other agents). Even if it contains lots of information from C, we still need to trust A[C].
In contrast, if info contained information about A[A[C]]’s beliefs, then we would not trust A[C] over that.
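To make the "over" concrete, here is a minimal sketch (with hypothetical names; the posts being discussed don't define this function) of what deferring to A[C] over C looks like: whenever info reveals A[C]'s credence in a proposition, our updated credence matches it, even if info also reveals C's (possibly conflicting) credence.

```python
def updated_belief(prior: float, info: dict) -> float:
    """Our credence in a fixed proposition X after seeing info.

    info may contain 'A' (A[C]'s reported credence) and/or 'C'
    (C's reported credence). Trusting A[C] *over* C means A's
    report dominates whenever it is present.
    """
    if "A" in info:      # A[C]'s report is in info: defer to it
        return info["A"]
    if "C" in info:      # only C's report is in info: use that
        return info["C"]
    return prior         # no reports in info: keep the prior

# Even when C's report is also present, our belief tracks A[C]'s:
print(updated_belief(0.5, {"A": 0.9, "C": 0.2}))  # 0.9
print(updated_belief(0.5, {"C": 0.2}))            # 0.2
print(updated_belief(0.5, {}))                    # 0.5
```

On this reading, "trusting A[C], full stop" and "trusting A[C] over C" come apart only when info can contain reports from both sources; the "over" says which one wins a conflict, and (per the last point above) it would fail if info instead contained A[A[C]]'s reports.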