In my example, as in many others, morality/utility is necessary for reducing questions about “beliefs” to questions about decisions. (Similar to how adding payoffs to the Sleeping Beauty problem clarifies matters a lot, and how naively talking about probabilities in the Absent-Minded Driver introduces a time loop.) In your formulation I may legitimately withhold judgment about unicorns—say the question is as meaningless as asking whether integers “are” a subset of reals, or a distinct set—because it doesn’t affect my future utility either way. In my formulation you can’t wiggle out as easily.
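To make the Sleeping Beauty point concrete, here is a toy sketch of my own (not something from the original comments) showing how the choice of payoff scheme settles the betting odds. Assume the usual setup: a fair coin gives one awakening on heads and two on tails, and Beauty commits to guessing the same way at every awakening.

```python
# Toy illustration: how payoffs disambiguate Sleeping Beauty.
# Heads -> 1 awakening, tails -> 2 awakenings; Beauty guesses the same way each time.

def expected_payout(guess: str, pay_per: str) -> float:
    """Expected reward for a fixed guess under a given payoff scheme."""
    ev = 0.0
    for coin, awakenings in (("heads", 1), ("tails", 2)):
        correct = guess == coin
        if pay_per == "awakening":      # one reward for each correct guess
            ev += 0.5 * awakenings * correct
        elif pay_per == "experiment":   # one reward if the repeated guess was right
            ev += 0.5 * correct
    return ev

# Paid per awakening, "tails" is worth twice as much: the "thirder" betting odds.
print(expected_payout("heads", "awakening"), expected_payout("tails", "awakening"))    # 0.5 1.0
# Paid once per experiment, the guesses tie: the "halfer" betting odds.
print(expected_payout("heads", "experiment"), expected_payout("tails", "experiment"))  # 0.5 0.5
```

The bare question "what probability should Beauty assign?" underdetermines the answer; fixing the payoffs turns it into a decision problem with a definite solution, which is the sense in which payoffs clarify matters.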
[Edited out—I need to think this over a little longer]
I thought about your questions some more, and stumbled upon a perspective that makes them all meaningful—yes, even the one about defining the real numbers. You have to imagine yourself living in a sort of “Solomonoff multiverse” that runs a weighted mix of all possible programs, and act so as to maximize your expected utility over that whole multiverse. Never mind “truth” or “degrees of belief” at all! If Omega comes to you and asks whether an inaccessible region of space contains galaxies or unicorns, bravely answer “galaxies” because it wins you more cookies weighted by universe-weight—simpler programs have more of it. This seems to be the coherent position that many commenters are groping toward…
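As a rough sketch of the decision rule being described (a toy of my own, with made-up program lengths standing in for description complexity), the idea is just to weight each candidate universe-program by something like 2^(-length) and answer whatever maximizes the weighted cookie count:

```python
# Toy sketch of the "Solomonoff multiverse" decision rule: weight each candidate
# universe-program by 2 ** (-description_length), then pick the answer that
# maximizes the weighted payoff from Omega.

# Hypothetical programs for the inaccessible region; the lengths are invented
# stand-ins for description complexity.
programs = [
    {"claim": "galaxies", "length": 20},
    {"claim": "unicorns", "length": 45},
]

def expected_cookies(answer: str) -> float:
    """Sum over programs of (prior weight) * (1 cookie if the answer matches)."""
    return sum(2.0 ** (-p["length"]) * (1.0 if p["claim"] == answer else 0.0)
               for p in programs)

best = max(["galaxies", "unicorns"], key=expected_cookies)
print(best)  # "galaxies" -- the simpler program carries almost all the weight
```

Nothing in the sketch appeals to truth or degrees of belief; the answer falls out of maximizing expected utility over the weighted mix of programs, with simpler programs contributing more weight.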