If we can find a problem where EDT clearly and irrevocably gives the wrong answer, we should not give it any credence
I think this is potentially an overly strong criterion for decision theories—we should probably restrict ourselves to something like a fair problem class, else we end up with no decision theory receiving any credence.
I also think “wrong answer” is doing a lot of work here. Caspar Oesterheld writes
However, there is no agreed-upon metric to compare decision theories, no way to assess even for a particular problem whether one decision theory (or its recommendation) does better than another. (This is why the CDT-versus-EDT-versus-other debate is at least partly a philosophical one.) In fact, it seems plausible that finding such a metric is “decision theory-complete” (to butcher another term with a specific meaning in computer science). By that I mean that settling on a metric is probably just as hard as settling on a decision theory and that mapping between plausible metrics and plausible decision theories is fairly easy.
I think this is potentially an overly strong criterion for decision theories—we should probably restrict ourselves to something like a fair problem class, else we end up with no decision theory receiving any credence.
Good point, I should have mentioned that in my article. (Note that XOR Blackmail is definitely a fair problem—not that you are claiming otherwise.)
I also think “wrong answer” is doing a lot of work here.
I agree here, at least in part. This is why I picked XOR Blackmail: it has such an obvious right answer. That’s an intuition, but the same is true of some of the points made in favor of The Evidentialist’s Wager to begin with.
Btw, thanks for your comment! I edited my post with respect to fair problems.