Um, Bayes doesn’t give you any promises, never mind guarantees, about your satisfaction. It’s basically like classical logic: it tells you the correct way to manipulate certain kinds of statements. “Satisfaction” is simply not in its vocabulary.
Exactly! That’s why I asked: “To what extent does [Bayes] provide humans with good advice as to how they should explicitly think about their beliefs and goals?”
We clearly do live in a world where Bayes math works. But that’s a different question from whether it represents good advice for human beings’ explicit, trained thinking about their goals.
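For concreteness, here’s the kind of statement-manipulation we’re both pointing at: a minimal sketch of Bayes’ rule applied to a made-up diagnostic test. The base rate and accuracies below are illustrative numbers, not from either of our posts.

```python
# Minimal sketch of Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# All numbers below are made up for illustration.

def posterior(prior: float, likelihood: float, false_positive_rate: float) -> float:
    """Probability of the hypothesis given one positive piece of evidence."""
    # Total probability of the evidence, via the law of total probability.
    p_evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / p_evidence

# Hypothetical test: 1% base rate, 90% sensitivity, 5% false-positive rate.
print(posterior(prior=0.01, likelihood=0.90, false_positive_rate=0.05))
# ~0.154: even a fairly accurate test leaves the hypothesis unlikely.
# The math guarantees coherence, nothing more.
```

The math reliably carries you from priors and likelihoods to posteriors; whether a human should explicitly run their goals through that machinery is the separate question I was asking.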
Edit: I’ve updated the post above to make this clearer.