While I believe that prediction is a (very useful) rationalist skill, it’s not the filter I would use to evaluate rationality. I suppose I think the skill is too difficult by several levels, and then too specialized. I don’t expect someone in the top .05% to know better what is going to happen, but I expect them to know better what to do when it happens.
… this is just based on a ‘large quantity of forgotten evidence colored by our experience and aggregated using intuition’ but then I consider some concrete examples to see if they fit...
.. I don’t expect a rational president to predict what will happen in Libya, but I expect him to have a good idea of which political theories to apply to the situation to make the best outcome most probable. It seems to be a different, weird kind of intelligence to anticipate that, say, one personality will form an alliance with another personality and they will cause event X that determines the outcome.
.. I had a friend in college who was very smart and she’s a vet now; I expect her to be able to figure out whatever is wrong with whatever animal comes to her clinic, including problems she hasn’t seen before, but I don’t expect she’d be able to predict much about anything she hasn’t seen before. She’d be able to make some educated guesses based on what she knows, but, again, something she doesn’t anticipate could easily eclipse her expectation.
I am not suggesting that prediction is the filter. Predictions of a certain kind (will policy change X be good or bad for the organization? Should we use our funds in this way or that way?) are a necessary part of doing business. If you want to know what is wrong with an animal who has come into a clinic, you want a prediction (either about the results of examinations, or about responses to treatment). Somehow an organization needs to make that prediction: it can give the authority to one person, it can let several people argue about it, etc.