The charitable interpretation of Eliezer’s position here is that he doesn’t want to tie the word ‘rational’ to any particular methodology; he wants to tie it to “winning”. So if someone comes up with a better decision theory, for example, he wants to evaluate that theory’s ‘rationality’ by whether it wins, not by whether it matches the orthodox methodology.
I don’t have a problem with that definition; it seems the most useful one. It’s just that being maximally rational doesn’t make you win every time. It maximizes expected winning. It doesn’t let you guess lotto numbers perfectly or generally act like you’re save-scumming the universe.
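To make the “maximizes expected winning, doesn’t guarantee winning” point concrete, here is a minimal sketch (my own illustration with made-up lottery numbers, not anything from Eliezer): skipping the ticket is the expected-winning-maximizing choice, yet on any single draw a ticket-buyer might still come out ahead.

```python
import random

# Illustrative numbers, not real lottery odds: a $1 ticket with a
# 1-in-1,000,000 chance of a $500,000 prize.
TICKET_PRICE = 1.0
WIN_PROB = 1 / 1_000_000
PRIZE = 500_000.0


def expected_value_of_buying() -> float:
    """Expected net winnings from buying one ticket."""
    return WIN_PROB * PRIZE - TICKET_PRICE


def simulate_buying(n_draws: int, seed: int = 0) -> float:
    """Average net result per draw if you buy a ticket for n_draws independent draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        won = rng.random() < WIN_PROB
        total += (PRIZE if won else 0.0) - TICKET_PRICE
    return total / n_draws


if __name__ == "__main__":
    # Not buying has expected value 0; buying has negative expected value,
    # so the "rational" move (in the expected-winning sense) is to skip the
    # ticket. Yet any single buyer can still win big -- maximizing expected
    # winnings is not the same as winning every time.
    print(f"EV of buying a ticket:      {expected_value_of_buying():+.4f}")
    print(f"Simulated average per draw: {simulate_buying(1_000_000):+.4f}")
```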