Upvoted for this sentence:
“If it ever turns out that Bayes fails—receives systematically lower rewards on some problem, relative to a superior alternative, in virtue of its mere decisions—then Bayes has to go out the window.”
This is such an important concept.
I will say this declaratively: The correct choice is to take only box two. If you disagree, check your premises.
“But it is agreed even among causal decision theorists that if you have the power to precommit yourself to take one box, in Newcomb’s Problem, then you should do so. If you can precommit yourself before Omega examines you; then you are directly causing box B to be filled.”
Is this your objection? The problem is, you don’t know whether the superintelligent alien is basing anything on “precommitment.” Maybe the superintelligent alien has some technology or understanding that allows him to actually see the end result of your future contemplation. Maybe he’s solved time travel and has seen what you pick.
Unless you understand not only what the alien does but also how he does it, you really are just guessing at how he’ll decide what to put in box two. And your record on guesses is not as good as his.
There’s nothing mystical about it. You do it because it works. Not because you know how it works.
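To put a rough number on “you do it because it works,” here is a minimal expected-value sketch of the one-box/two-box comparison. It assumes the payoffs usually quoted for Newcomb’s Problem ($1,000 in the transparent box, $1,000,000 in box two), which are not given in this thread, and it treats the predictor’s accuracy as a free parameter rather than anything we claim to know about his method.

```python
# Minimal sketch of the expected-value arithmetic behind one-boxing.
# Assumed payoffs (the usual presentation, not figures from this thread):
#   transparent box: $1,000
#   box two: $1,000,000 if the predictor foresaw one-boxing, otherwise $0.

def expected_payoffs(accuracy: float) -> tuple[float, float]:
    """Return (one_box, two_box) expected payoffs for a predictor of the given accuracy."""
    small, large = 1_000, 1_000_000
    one_box = accuracy * large                  # box two is full only if the prediction was right
    two_box = (1 - accuracy) * large + small    # box two is full only if the predictor erred
    return one_box, two_box

if __name__ == "__main__":
    for accuracy in (0.5, 0.9, 0.99, 1.0):
        one, two = expected_payoffs(accuracy)
        print(f"accuracy={accuracy:.2f}  one-box: ${one:>12,.0f}  two-box: ${two:>12,.0f}")
```

Under these assumptions one-boxing pulls ahead as soon as the predictor is right more than about 50.05% of the time, so a predictor whose record on guesses is better than yours makes the comparison lopsided.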
“If it ever turns out that Bayes fails—receives systematically lower rewards on some problem, relative to a superior alternative, in virtue of its mere decisions—then Bayes has to go out the window.”
This is such an important concept.
Yes, but like falsifiability, it’s dangerous. This goes for ‘rationalists win’, too.
‘We’ (Bayesians) face the Duhem-Quine thesis with a vengeance: we have often found situations where Bayes failed. And then we rescued it (we think), either by coming up with novel theses (TDT) or by carefully analyzing the problem, or a related problem, declaring that to be the real answer, and concluding that Bayes works after all (Jaynes, again and again). Have we corrected ourselves, or just added epicycles and special pleading? Should we have just tossed Bayes out the window at that point, except in the limited areas where we had already proved it to be optimal or useful?
This can’t really be answered.
I liked the quote not because of any notion that Bayes will or should “go out the window,” but because, coming from a devout (can I use that word?) Bayesian, it’s akin to a mathematician saying that if 2+2 ceases to be 4, that equation goes out the window. I just like what this says about one’s epistemology—we don’t claim to know with dogmatic certainty, but in varying degrees of certainty, which, to bring things full circle, is what Bayes seems to be all about (at least to me, a novice).
More concisely, I like the quote because it draws a line. We can rail against the crazy strict Empiricism that denies rationality, but we won’t hold to a rationality so devoutly that it becomes faith.
“because, coming from a devout (can I use that word?) Bayesian, it’s akin to a mathematician saying that if 2+2 ceases to be 4, that equation goes out the window.”
Duhem-Quine is just as much a problem there; from Ludwig Wittgenstein, Remarks on the Foundations of Mathematics:
“If a contradiction were now actually found in arithmetic – that would only prove that an arithmetic with such a contradiction in it could render very good service; and it would be better for us to modify our concept of the certainty required, than to say it would really not yet have been a proper arithmetic.”
Indeed.
To generalize, when we run into skeptical arguments employing the above circularity or fundamental uncertainties, I think of Kripke:
“A skeptical solution of a philosophical problem begins… by conceding that the skeptic’s negative assertions are unanswerable. Nevertheless our ordinary practice or belief is justified because—contrary appearances notwithstanding—it need not require the justification the sceptic has shown to be untenable. And much of the value of the sceptical argument consists precisely in the fact that he has shown that an ordinary practice, if it is to be defended at all, cannot be defended in a certain way.”