related idea: when could seeking to improve our maps cost us instrumental rationality?
I have an example of this. I was at a meeting at work last year where a research group was proposing a study (to get funding) to provide genetic “counseling” to poor communities in Harlem. One person raised an objection, which I'll paraphrase: we can teach people as much as we can about real genetic risk factors for diseases, but without serious education, most people probably won’t get it.
They’ll hear “genes, risk factor,” overestimate their actual risk, and make poor decisions based on misunderstood information. In striving to improve epistemic rationality, we could impair true instrumental “winning.”
So in this case, being completely naive leads to better outcomes than having more, but incomplete, knowledge.
Not sure what the outcome of the actual study was.
You might find this post interesting; I think it touches on the issue you bring up, but from another direction.