If I’m understanding your comment correctly, you’re suggesting that the threshold between “belief in belief” and “failure to internalize” in this case has to do with the willingness to make predictions/bets—e.g., if I’m willing to give someone a large sum of money in exchange for a reliable commitment to give me a much, much larger sum of money after I am restored from cryonic suspension, then we say I have a “genuine belief” in cryonics and not a mere “belief in belief”, although I might still fail to have an “internalized belief”… is that right?
Sounds right.
Cool.
Having clarified that: can you say more about why the distinction (between belief in belief in cryonics, and genuine but non-internalized belief in cryonics) is important in this context? That is… why do you bring it up?
Well, if someone had belief in belief in cryonics, they might say that cryonics would preserve people and allow them to be brought back in the future, but every time they have to make a falsifiable prediction based on that claim, they’ll find an excuse to avoid backing it. If they’re willing to make falsifiable predictions based on their belief, but go on behaving the same as if they expected to live only several more decades, they probably have only a far mode belief in cryonics.
It takes different input to bring a far mode belief into near mode than to convince someone to actually believe something they formerly only had belief in belief in.
It seems like the major difference here is compartmentalization. Someone who only takes a belief into account when it is explicitly called to their attention has a belief that is not internalized.