(If it’s unknowable, how can we know that a certain prediction strategy is going to be systematically biased in a known direction? Biased with respect to what knowable standard?)
I forget how much detail there is on this later in this talk, but it is in his book. The systematic bias towards pessimism comes from the method itself: trying to imagine the future using today’s knowledge, which is necessarily less than the knowledge the future will have.
Quoting Deutsch from The Beginning of Infinity:
Trying to know the unknowable leads inexorably to error and self-deception. Among other things, it creates a bias towards pessimism. For example, in 1894, the physicist Albert Michelson made the following prophecy about the future of physics:
The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote. … Our future discoveries must be looked for in the sixth place of decimals.
(Albert Michelson, address at the opening of the Ryerson Physical Laboratory, University of Chicago, 1894)
What exactly was Michelson doing when he judged that there was only an ‘exceedingly remote’ chance that the foundations of physics as he knew them would ever be superseded? He was prophesying the future. How? On the basis of the best knowledge available at the time. But that consisted of the physics of 1894! Powerful and accurate though it was in countless applications, it was not capable of predicting the content of its successors. It was poorly suited even to imagining the changes that relativity and quantum theory would bring – which is why the physicists who did imagine them won Nobel prizes. Michelson would not have put the expansion of the universe, or the existence of parallel universes, or the non-existence of the force of gravity, on any list of possible discoveries whose probability was ‘exceedingly remote’. He just didn’t conceive of them at all.
It’s inconsistent to expect the future to be better than one expects. If you think your probability estimates are too pessimistic, adjust them until you no longer know whether they are too optimistic or too pessimistic. Nothing stops you from assigning probability mass to outcomes like “technological solution that does away with problem X” or “scientific insight that makes the question moot”. A claim to know that the best possible probability estimate is biased in a particular direction cannot be correct.
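One way to make this consistency point precise (my own framing, not Deutsch’s or the speaker’s) is the law of total expectation applied to your own future beliefs. Write $p$ for your current probability of a good outcome and $P'$ for the probability you expect to assign once more is known. Coherent updating requires

$$\mathbb{E}[P'] = p,$$

so if you genuinely expect $\mathbb{E}[P'] > p$, i.e. you expect future knowledge to reveal that you were too pessimistic, you should raise $p$ now, and keep raising it until the expected correction is zero. That is exactly the “adjust until you don’t know which way you’re off” prescription above.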