Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn’t place enough priority on being right to see that this is the case. This is why we should avoid the “rationalists should win” mantra—figuring out what “winning” means is at least as essential as actually winning.
I reject out of hand the idea that she should deconvert in the closet and systematically lie to everyone she knows.
Rejecting options out of hand is bad, especially when the alternatives suck.
Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn’t place enough priority on being right to see that this is the case.
After parsing this, I think you are saying:
1. Many people who think they have higher priorities than being right
2. Do not have higher priorities than being right
3. But do not know they do not have higher priorities than being right
4. Because they do not have a high enough priority with regard to being right
So, replacing “priorities” with “X” and “being right” with “Y” we get this:
1. Many people who think they have higher X than Y
2. Do not have higher X than Y
3. But do not know they do not have higher X than Y
4. Because they do not have a high enough X with regard to Y
Which is a very mean and uncharitable way of saying I do not know what you mean. I think my difficulty is that I rank priorities against themselves. To me, Priority of 55 makes no sense. Fifty-fifth Priority does. Bumping priority up means replacing a higher rank with a lower rank. If something has no higher priority, it is First Priority. With these definitions, your statement makes no sense because (2) and (4) are incompatible.
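A minimal sketch of the two readings being contrasted here (the goal names, weights, and threshold below are invented for illustration, not taken from the thread): under a purely ordinal reading, “nothing has higher priority than being right” forces being right into First Priority, whereas under a weighted reading a goal can carry the largest weight and still receive too little investment for its owner to notice that it does.

```python
# Hypothetical illustration of ordinal ranks vs. weighted priorities.
# All goal names, weights, and the threshold are invented for this sketch.

# Ordinal reading: priorities are ranks measured against each other.
# "Nothing has higher priority than being right" forces rank 1 (First Priority).
ordinal = ["being right", "comfort", "fitting in"]   # rank 1, 2, 3
assert ordinal.index("being right") == 0

# Weighted reading: each goal gets a share of effort and attention.
# "Being right" can hold the largest share and still fall below the effort
# needed to examine one's own priorities and discover that it is the largest.
weights = {"being right": 0.40, "comfort": 0.35, "fitting in": 0.25}
effort_needed_to_notice = 0.60

top_goal = max(weights, key=weights.get)
print(top_goal)                                        # being right
print(weights[top_goal] >= effort_needed_to_notice)    # False
```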
OK, I can see how that was unclear, but I stand by the statement. Figuring out what one’s true goals are is itself a problem that one can apply rationality to. Many people think applying rationality doesn’t help achieve their goals well enough to be worth the costs. But they’re wrong: rationality helps achieve their true goals well enough to be worth the costs. If they applied rationality enough, they’d find out that their true goals aren’t what they thought they were, and conclude that applying rationality was indeed worth it.
An irrational person cannot reliably assess the cost of being irrational. A rational person can. People who have chosen rationality almost always agree choosing rationality was worth it.
A red box and a blue box; one of them contains a diamond. Wednesday asks, “How would this ‘rationality’ thing help me get to the red box, which contains the diamond?” But the diamond is in the blue box.
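A toy version of the box example, purely illustrative (the method names and scoring below are invented, not from the thread): if Wednesday grades “rationality” by whether it steers her to the box she already believes holds the diamond, she is grading it against her belief rather than against the diamond.

```python
# Toy model of the red/blue box example; names and outcomes are illustrative.

actual_location = "blue"      # where the diamond really is
believed_location = "red"     # where Wednesday believes it is

def chosen_box(method: str) -> str:
    """Which box each (hypothetical) method of deciding ends up opening."""
    if method == "trust my current belief":
        return believed_location
    if method == "weigh the evidence":   # stand-in for "rationality finds the truth"
        return actual_location
    raise ValueError(method)

for method in ("trust my current belief", "weigh the evidence"):
    box = chosen_box(method)
    passes_her_test = (box == believed_location)   # "does it reach the red box?"
    gets_the_diamond = (box == actual_location)    # the test that matters
    print(f"{method}: passes her test={passes_her_test}, gets diamond={gets_the_diamond}")
```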
An irrational person cannot reliably assess the cost of being irrational. A rational person can.
Yes, a fully rational person is better able to assess the relative costs of being irrational vs. rational. But this knowledge won’t help them much if it turns out that the costs of being irrational were lower after all.
Yeah, that makes more sense. I think there is a danger in telling someone they do not know what they really want or what their true goals are, but I understand your point and agree.
Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn’t place enough priority on being right to see that this is the case. This is why we should avoid the “rationalists should win” mantra—figuring out what “winning” means is at least as essential as actually winning.
That’s open to interpretation. The procedure by which you figure out what “winning” means is itself a rational pursuit, one that had better be precisely targeted, with “winning′” in that meta-game already fixed. You have to stop somewhere, and actually write the code.
You do indeed have to stop somewhere, but any algorithm that stops before rejecting everything that’s at least one tenth as wrong as Mormonism is broken.
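A sketch of the “you have to stop somewhere, and actually write the code” point (the refinement function, the depth, and the options below are all invented for illustration): the criterion for refining “winning” is fixed at the level above, the refinement is cut off at a finite depth, and then the agent acts on whatever definition it ended up with.

```python
# Illustrative only: a bounded loop that refines the definition of "winning"
# a fixed number of times, then stops and acts.

def refine(definition: str) -> str:
    """Stand-in for one level of asking what 'winning' really amounts to."""
    return f"whatever actually serves ({definition})"

def act_on(definition: str, options: list[str]) -> str:
    """Stand-in for actually writing the code / taking the action."""
    # A real agent would score the options against the definition;
    # here we just pick the first to keep the sketch runnable.
    return options[0]

MAX_DEPTH = 2   # fixed at the meta-level: how far up the regress to go
winning = "my stated goals"
for _ in range(MAX_DEPTH):
    winning = refine(winning)

print(winning)
print(act_on(winning, ["deconvert and tell the truth", "carry on as before"]))
```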
Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn’t place enough priority on being right to see that this is the case.
Can you help me disentangle what you mean by this? There seems to be some equivocation.
Rejecting options out of hand is bad, especially when the alternatives suck.
I rejected that option for ethical reasons. The alternatives do suck, but “carry on believing as always” and “deconvert, then tell an uncomfortable truth” are at least not unethical.
The alternatives do suck, but “carry on believing as always” and “deconvert, then tell an uncomfortable truth” are at least not unethical.
Choosing to believe falsely and then speaking honestly is at least as unethical as choosing to believe truly and then lying. The former amounts to lying and then committing the further ethical crime of believing one’s own lies.
Yeah, that makes more sense. I think there is a danger in telling someone they do not know what they really want or what their true goals are, but I understand your point and agree.
I don’t think the danger is in saying that another doesn’t know their true goals so much as in thinking that you do know them.
You do indeed have to stop somewhere, but any algorithm that stops before rejecting everything that’s at least one tenth as wrong as Mormonism is broken.
Huh? The algorithm doesn’t stop; the meta-meta-goal has to be fixed at some point.
Almost everyone who thinks he or she has higher priorities than being right actually does not have higher priorities than being right, but doesn’t place enough priority on being right to see that this is the case.
Can you help me disentangle what you mean by this? There seems to be some equivocation.
For clarification, see my reply to MrHen.