Another way to frame this: correct for biases in your sensitivity to new information.
Enron was too insensitive to new information. It biased itself towards insensitivity by rewarding those who stuck to the party line.
Conversely, a founder who gives up after hearing a few ’no’s from investors is likely too sensitive to new information. They’re biased in the opposite direction: it’s often easier to give up than to trudge on.
Eliezer’s point is that most of us are too insensitive to new information because it’s painful to admit that we were wrong. I agree, but it’s not a universal truth: there are also times when it’s painful to admit that we were right, as with the founder above. The universal truth is that it’s good to correct for biases in your sensitivity to new information.
Examples:
Alice moves to Examplestan, where everyone believes that 3+2=7. Alice should be less sensitive to new information on the sum of 3 and 2 than she’s inclined to be, because it’s easier to conform.
Bob is a fervent ateapotist (someone who doesn’t believe that there’s a teapot between Earth and Mars), has mostly ateapotist friends, and heads up the local ateapotist club. If NASA publishes new images of a teapot between Earth and Mars, Bob should be more sensitive to those images than he’s inclined to be because it’s easier to stick with what he already believes (and retain his friends) than to discard what he believes (and lose his friends).
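One way to make “sensitivity to new information” concrete is a tempered Bayesian update, where sensitivity acts as an exponent on the likelihood ratio. This is an illustrative sketch, not anything from the original argument; the function name and the numbers are made up:

```python
def update_odds(prior_odds, likelihood_ratio, sensitivity=1.0):
    """Tempered Bayesian update on odds.

    sensitivity = 1.0 is the standard Bayes rule;
    sensitivity < 1 models under-reaction to evidence (Enron, Bob);
    sensitivity > 1 models over-reaction (the discouraged founder).
    """
    return prior_odds * likelihood_ratio ** sensitivity

# Three pieces of evidence, each of which should halve the odds
# in favour of the current belief (likelihood ratio 0.5).
ideal = biased = 4.0  # prior odds in favour of the belief
for _ in range(3):
    ideal = update_odds(ideal, 0.5)                   # ends at 4 * 0.5**3 = 0.5
    biased = update_odds(biased, 0.5, sensitivity=0.2)  # barely moves

# The under-sensitive updater still favours the belief after the
# same evidence that flipped the ideal updater against it.
```

Correcting for a bias, in this framing, just means noticing which direction your sensitivity exponent is skewed and pushing it back toward 1.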