Almost everyone who thinks they have higher priorities than being right actually doesn’t; they just don’t place enough priority on being right to see that this is the case. This is why we should avoid the “rationalists should win” mantra: figuring out what “winning” means is at least as essential as actually winning.
That’s open to interpretation. The procedure by which you figure out what “winning” means is itself a rational pursuit, one that had better be precisely targeted, with “winning” in that meta-game already fixed. You have to stop somewhere and actually write the code.
You do indeed have to stop somewhere, but any algorithm that stops before rejecting everything that’s at least one tenth as wrong as Mormonism is broken.
Huh? The algorithm doesn’t stop; it’s the meta-meta-goal that has to be fixed at some point.