… appeal to consequences. Well, that is new in this conversation. It’s not very constructive though.
If your conception of rationality leads to worse consequences than doing something differently, you should do something differently. Do you think it’s impossible to do better than resigning yourself to the destruction of New York?
That’s not what a trust-system is. Simply put, it is the practice of trusting that something is so because the expected utility-cost of being wrong is lower than the expected utility-cost of investigating the claim. This practice is a foible, a failing, one engaged in out of necessity because human cognitive resources are limited.
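The trade-off can be sketched as a simple decision rule. This is only an illustration of the idea above; the function name and the numbers are hypothetical, not anything from this conversation:

```python
def should_trust(p_wrong, cost_if_wrong, cost_to_investigate):
    """Trust a claim when the expected cost of being wrong
    (probability of error times its cost) is lower than the
    cost of investigating the claim yourself."""
    return p_wrong * cost_if_wrong < cost_to_investigate

# A low-stakes claim: trusting is cheaper than checking.
should_trust(0.1, 5, 2)     # expected loss 0.5 < 2, so trust

# The same claim with your life on the line: now investigate.
should_trust(0.1, 1000, 2)  # expected loss 100 > 2, so check
```

The point of the rule is exactly the one in the text: trusting is not a virtue of the claim, it is economizing on limited attention.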
The utility cost of being wrong can fluctuate. Your life may hinge tomorrow on a piece of information you did not consider investigating today. If you find yourself in a situation where you must make an important decision hinging on little information, you can do no better than your best estimate, but if you decide that you are not justified in holding forth an estimate at all, you will have rationalized yourself into helplessness.
Humans have bounded rationality. Computationally optimized Jupiter Brains have bounded rationality. Nothing in this universe can have unlimited cognitive resources, but with enough computational power and effective weighting of evidence, you can calculate how much confidence any given amount of information warrants.
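"How much confidence a given amount of information warrants" has a standard formal answer: Bayes' rule. A minimal sketch, assuming independent observations and made-up likelihoods chosen purely for illustration:

```python
def posterior(prior, lh_if_true, lh_if_false, n_observations=1):
    """Bayes' rule: confidence in a hypothesis after seeing
    n independent observations, each with the given likelihoods
    under the hypothesis and under its negation."""
    p_true = prior * lh_if_true ** n_observations
    p_false = (1 - prior) * lh_if_false ** n_observations
    return p_true / (p_true + p_false)

# One piece of weak evidence barely moves a 50% prior:
posterior(0.5, 0.6, 0.4, n_observations=1)   # 0.6

# Ten such pieces compound into strong confidence:
posterior(0.5, 0.6, 0.4, n_observations=10)  # ~0.98
```

The confidence level falls out of the arithmetic; more evidence warrants more confidence, and the formula says exactly how much.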
Finally: you’re introducing another unlike-variable by abstracting from individual instances to averaged aggregate.
You can get the expected number of true statements just by adding the probabilities of truth of each statement. It’s like estimating how many heads you should expect in a series of coin flips: .5 + .5 + .5 + … The same formula works even if the probabilities are not all the same.
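This is linearity of expectation, and it takes one line to demonstrate (the credence values below are made up for illustration):

```python
# Expected number of heads in three fair coin flips:
# linearity of expectation says just add the probabilities.
coin_flips = [0.5, 0.5, 0.5]
expected_heads = sum(coin_flips)       # 1.5

# The same formula with unequal probabilities: four claims
# held at different credences still sum to an expected count
# of true statements.
credences = [0.9, 0.7, 0.2, 0.4]
expected_true = sum(credences)         # 2.2
```

No independence assumption is even needed for the expected count; linearity of expectation holds regardless of how the statements correlate.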
Apparently not.