There are a couple of problems here. First is the usual thing forgotten on LW—costs. “More information” is worthwhile iff its benefits outweigh the costs of acquiring it. Second, your argument implies that, say, attempting to read the entire Wikipedia (or Encyclopedia Britannica if you are worried about stability) from start to finish would be a rational thing to do. Would it?
No, it isn’t. Being curious is a good heuristic for most people, because most people are in the region where the cost of gathering information is lower than its expected value. I don’t think we disagree on anything concrete: I’m not claiming that curiosity is rational in itself, a priori, only that it is a fairly good heuristic.
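The “gather information iff its expected value exceeds its cost” rule can be made concrete with a toy value-of-information calculation. This is a sketch with made-up numbers (the umbrella scenario and all payoffs are hypothetical, not from the thread), using the standard decision-theoretic definition: the value of perfect information is the expected utility when you can observe the state before acting, minus the best expected utility you can achieve acting blind.

```python
# Toy value-of-information sketch (all numbers hypothetical).
# Decision: carry an umbrella or not; it rains with probability 0.3.
p_rain = 0.3

# Payoff table: (action, weather) -> utility
utility = {
    ("umbrella", "rain"): 0,    # dry but encumbered
    ("umbrella", "sun"):  -1,   # encumbered for nothing
    ("none", "rain"):     -10,  # soaked
    ("none", "sun"):      1,    # unencumbered and dry
}

def expected_utility(action, p):
    """Expected payoff of an action under rain probability p."""
    return p * utility[(action, "rain")] + (1 - p) * utility[(action, "sun")]

actions = ("umbrella", "none")

# Best you can do without checking the forecast:
eu_no_info = max(expected_utility(a, p_rain) for a in actions)

# With a perfect forecast you pick the best action in each state:
eu_perfect = (p_rain * max(utility[(a, "rain")] for a in actions)
              + (1 - p_rain) * max(utility[(a, "sun")] for a in actions))

voi = eu_perfect - eu_no_info  # value of perfect information
print(voi)  # checking the forecast is worth it iff its cost is below this
```

With these numbers the blind optimum is to carry the umbrella (expected utility −0.7), while a perfect forecast yields 0.7, so the information is worth up to 1.4 utility. “Being curious is a good heuristic” just means that for most everyday information, the acquisition cost sits well below this kind of threshold.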
This is a good point about taking costs into account. I want to cover this idea in my third post, which I am still writing; it will probably be something like Principle 3 – your rationality depends on the usefulness of your internal representation of the world. My view is that truth seeking should be viewed as an optimization process: if it doesn’t allow you to become more optimal, then it is not worth it. I have a post about this here.