So it’s not only strategic ignorance, but selective ignorance too. By which I mean it only applies in a highly selective set of situations.
If you know enough about the situation to know the options are either 6/1 and 5/5, or 5/1 and 6/5 (reading X/Y as X to you, Y to the other party), that’s a pretty clear distinction. You already have quite a bit of knowledge, enough to narrow it down to just two possibilities.
But as you raised, it could be 6/1 & 5/5, or 6/1 & 5/1000, or 6/(a .0001% increase in global existential risk) & 5/(a .0001% increase in the chance of the singularity within your lifetime).
The implication of your point is that if you don’t know what’s at stake, it’s better to learn what’s at stake.
The problem with strategic ignorance comes when the situation turns out to be something like 6/1 vs. 5/1000.
Most people care more about themselves than about others, but I think at that ratio most people would gladly lose a dollar so that someone else gains 999 more.
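To make that concrete, here’s a minimal sketch (my own framing, not anything from the original exchange) that treats each option as a (you, others) payoff and scores it with a single altruism weight w, i.e. how much you value a dollar going to someone else relative to a dollar for yourself:

```python
def utility(payoff, w):
    """Score a (dollars to you, dollars to others) payoff with altruism weight w."""
    to_you, to_others = payoff
    return to_you + w * to_others

# 6/1 vs. 5/5: you prefer 6/1 whenever 6 + w*1 > 5 + w*5, i.e. w < 0.25.
# 6/1 vs. 5/1000: you prefer 5/1000 whenever w > 1/999 (about 0.001).
for w in (0.0005, 0.01, 0.3, 1.0):
    pick_a = utility((6, 1), w) > utility((5, 5), w)
    pick_b = utility((6, 1), w) > utility((5, 1000), w)
    print(f"w={w}: prefer 6/1 over 5/5: {pick_a}, prefer 6/1 over 5/1000: {pick_b}")
```

The point of the thresholds: even someone who weights others’ welfare at a tiny fraction of their own (any w above roughly 0.001) should take the 5/1000 option, which is why the 6/1 vs. 5/1000 case is the one where staying ignorant is most costly.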
If you choose not to learn something, then you don’t know what you’re causing to happen, even when that knowledge would entirely change what you would want to do.
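That’s exactly where the value of learning shows up. A quick sketch, assuming (purely for illustration) that the second option is 5/5 or 5/1000 with equal probability, and that your default under ignorance is to grab the 6 without looking:

```python
def utility(payoff, w):
    """Score a (dollars to you, dollars to others) payoff with altruism weight w."""
    to_you, to_others = payoff
    return to_you + w * to_others

def value_of_learning(p, w):
    """Expected gain from learning which situation you're in, vs. staying ignorant.

    p is the probability the second option is 5/5 (else it's 5/1000);
    under ignorance you default to taking 6/1.
    """
    ignorant = utility((6, 1), w)
    informed = (p * max(utility((6, 1), w), utility((5, 5), w)) +
                (1 - p) * max(utility((6, 1), w), utility((5, 1000), w)))
    return informed - ignorant

print(value_of_learning(p=0.5, w=0.1))  # about 49.45: learning is worth a lot here
```

The informed chooser can never do worse than the ignorant one, and does strictly better whenever there’s a chance the stakes are lopsided, which is the whole argument against choosing not to look.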