It’s not clear to me why people who accurately model the world should outperform those who follow less cognitively demanding heuristics. I’ve seen this position stated as a truism during a debate, but I’ve never read an argument for or against it. Could someone link to an argument about following non-robust shortcuts to rationality, or write a short case against that practice?
If an analysis of these thinking processes finds that the advantage comes from superior allocation of bounded computational resources, that would be both an interesting finding and a sufficient explanation. In some cases the alternative heuristics may even be worth adopting.
Isn’t the common-sense expectation adequate here: that non-robust heuristics deliver poor results across a wider range of possible future environments than robust heuristics do?