There is the argument that rationality should be seen as a meta-algorithm: you look at the particular situation, work out what specific algorithm will win, and then use that.
Having the meta-algorithm will work better (on average) than committing to any particular algorithm, though the time lag and limited foresight of rational methods mean that, in any given context, there will usually be some specific algorithm that beats rationality.
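The meta-algorithm idea can be sketched in a few lines of code. This is a minimal toy illustration, not anything from the original argument: the candidate algorithms (`cautious`, `bold`) and the payoff estimator are entirely hypothetical stand-ins.

```python
# Toy sketch of rationality as a meta-algorithm: given a situation,
# estimate how well each candidate algorithm would do, then run the
# one expected to win. All names and payoffs here are made up.

def meta_algorithm(situation, candidates, estimate_payoff):
    """Pick and run whichever candidate algorithm is expected to win."""
    best = max(candidates, key=lambda alg: estimate_payoff(alg, situation))
    return best(situation)

# Hypothetical candidate algorithms.
def cautious(situation):
    return "hedge"

def bold(situation):
    return "commit"

# A made-up payoff estimator: boldness pays off only in stable situations.
def estimate_payoff(alg, situation):
    stable = situation.get("stable", False)
    return 1.0 if (alg is bold) == stable else 0.0

print(meta_algorithm({"stable": True}, [cautious, bold], estimate_payoff))
print(meta_algorithm({"stable": False}, [cautious, bold], estimate_payoff))
```

The point the sketch makes concrete: `meta_algorithm` outperforms either fixed candidate on average, yet in any single situation it can do no better than the specific algorithm it selects, and it pays the extra cost of the selection step.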
By rationality I mean, specifically, reaching accurate beliefs about the world. If we define rationality as “whatever strategies win”, then of course rationalists are going to win; however, it might also be the case that the rational thing to do is to brainwash yourself with drugs and mental training into believing in nonexistent deities (depending on how tolerant your society is of atheists and how good a liar you are).
I think I have a substantive point with regard to epistemic rationality actually being a meta-algorithm.
“you look at the particular situation, work out what specific algorithm will win, and then use that.”
I should amend this to:
“you look at the particular situation, work out that your adopting a specific algorithm will cause you to win, and then try to use that algorithm.”
So, if an epistemic rationalist forms the true belief that the way to win is to brainwash himself with drugs and mental training into believing in nonexistent deities, and he also prefers winning, then he is rationally compelled to try his hardest to brainwash himself accordingly, and he will therefore probably stand the best chance of winning.
The force of this point is that if, instead of being an epistemic rationalist, he had merely started out as an ardent atheist, he would not have won.