I think the article is describing some existing mechanisms rather than prescribing what a rationalist should be doing.
But rationalists should win.
Good point. How would you resolve this contradiction, then?
I personally strive to know as much as I can about myself, even if it ultimately means believing a lot of less-than-flattering things about myself.
Then I try either to use this knowledge to fix the problems, or to figure out workarounds in how I present myself to others.
Some people are pretty okay with you knowing bad things about yourself, as long as you wish they weren’t true. A lot of my closer friends are like that, so I can stay completely honest with them. If someone isn’t okay with that, then I either preempt any complaints by admitting I messed up (many people find that less offensive than evasiveness), or avoid those conversations entirely.
In extreme cases, I’d rather know something about myself and hide it (whether by omission or by outright lying), or just let other people judge me for knowing it, than stay ignorant of it.
One convenient thing about allowing yourself to learn inconvenient truths is that it’s easier to realize when you’re wrong and should apologize. Apologies tend to work really well when you mean them and understand why the other person is mad at you.
There are three things you could want:
1. You could want the extra dollar ($6 instead of $5).
2. You could want to feel like someone who cares about others.
3. You could genuinely care about others.
The point of the research in the post, if I understand it, is that (many) people want 1 and 2, and often the best way to get both of those things is to be ignorant of the actual effects of your behavior. In my view, a rationalist should decide either that they want 1 (throwing 2 and 3 out the window) or that they want 3 (forgetting 1). Either way, you can know the truth and still win.
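To make that concrete, here is a toy model of the hidden-payoff game, sketched in Python. The 50/50 odds over scenarios, the guilt cost of 2, the linear “caring” utility, and the three agent labels are my own illustrative assumptions, not numbers from the post:

```python
# Option A pays you $6, option B pays you $5. The other person's payoffs
# are either "conflict" (A: $1, B: $5) or "aligned" (A: $5, B: $1), and
# you may privately reveal which scenario you are in before choosing.
SCENARIOS = {                # scenario -> {option: (your $, their $)}
    "conflict": {"A": (6, 1), "B": (5, 5)},
    "aligned":  {"A": (6, 5), "B": (5, 1)},
}
PROB = {"conflict": 0.5, "aligned": 0.5}  # assumed 50/50 prior
GUILT = 2.0  # assumed self-image cost of *knowingly* shortchanging them

def utility(agent, own, other, knew_harm):
    if agent == "selfish":      # wants only (1): the extra dollar
        return own
    if agent == "self_image":   # wants (1) and (2): dollars minus known guilt
        return own - (GUILT if knew_harm else 0.0)
    return own + other          # "caring": wants (3), values their dollars too

def eu_reveal(agent):
    """Look at the payoffs, then pick the best option in each scenario;
    picking the option that is worse for the other person is known harm."""
    total = 0.0
    for name, options in SCENARIOS.items():
        worst = min(other for _, other in options.values())
        total += PROB[name] * max(
            utility(agent, own, other, knew_harm=(other == worst))
            for own, other in options.values()
        )
    return total

def eu_ignorant(agent):
    """Commit to one option without looking; harm is never *known*."""
    return max(
        sum(PROB[name] * utility(agent, *SCENARIOS[name][opt], knew_harm=False)
            for name in SCENARIOS)
        for opt in ("A", "B")
    )

for agent in ("selfish", "self_image", "caring"):
    print(f"{agent:>10}: reveal={eu_reveal(agent):.2f}  ignorant={eu_ignorant(agent):.2f}")
```

Under these assumptions the selfish type gets 6.00 either way, the caring type does better by revealing (10.50 vs. 9.00), and only the self-image type profits from staying ignorant (6.00 vs. 5.50), which is exactly the pattern described above.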
The problem with strategic ignorance is when the situation is something like 6/1 vs. 5/1000 (you get $6 and they get $1, versus you get $5 and they get $1,000).
Most people care more about themselves than about others, but I think that at that level most people would still choose to lose a dollar and give $999 more.
If you choose not to learn something, then you don’t know what you’re causing to happen, even when knowing would entirely change what you’d want to do.
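One way to quantify that is as a value-of-information calculation. In this sketch, the weight you place on the other person’s payoff and the prior over the hidden stake are made-up numbers, chosen only to show the mechanics:

```python
# Option A always pays (you: $6, them: $1). Option B pays you $5, but the
# other person's payoff X is uncertain: maybe $5, maybe $1000.
W = 0.05                       # assumed weight you place on their payoff
PRIOR = {5: 0.99, 1000: 0.01}  # assumed prior over the hidden stake X

def utility(own, other):
    return own + W * other

def informed():
    """Learn X first, then pick the better option in each case."""
    return sum(p * max(utility(6, 1), utility(5, x)) for x, p in PRIOR.items())

def ignorant():
    """Commit to one option without learning X."""
    eu_a = utility(6, 1)
    eu_b = sum(p * utility(5, x) for x, p in PRIOR.items())
    return max(eu_a, eu_b)

voi = informed() - ignorant()
print(f"informed={informed():.3f}  ignorant={ignorant():.3f}  VOI={voi:.3f}")
# VOI is never negative, and it is strictly positive exactly when the
# hidden stakes could flip which option you prefer, as in the 5/1000 case.
```

For any agent who acts on expected utility, this difference cannot be negative, so when the stakes might be as lopsided as 5/1000, learning them is worth it.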
So it’s not only strategic ignorance but selective ignorance too. By that I mean it only works when applied highly selectively.
If you know enough about the situation to know the options are 6/1 and 5/5, or 5/1 and 6/5, then that’s a pretty clear distinction. You already have quite a bit of knowledge, enough to narrow it down to only two possible situations.
But as you raised, it could be 6/1 & 5/5, or 6/1 & 5/1000, or 6/(a 0.0001% increase in global existential risk) & 5/(a 0.0001% increase in the chance of a singularity within your lifetime).
The implication of your point being: if you don’t know what’s at stake, it’s better to learn what’s at stake.
Yeah, pretty much.
Then I guess sometimes ---ists (as I like to refer to them) should remain purposefully ignorant, in contradiction to the maxim (if, that is, they actually care about the advantages of ignorance).