It always seems to me that any little disclaimer about my degree of certainty disproportionately skews the way others interpret my statements.
For instance, if I’m 90% sure of something, and carefully state it in a way that conveys my level of confidence (as distinct from 100%), people seem to react as if I’m substantially less than 90% confident. In other words, any acknowledgement of less-than-100%-confidence seems to be interpreted as not-very-confident-at-all.
I find a similar effect. It looks to me like most people systematically state probabilistic claims above their internal estimate of certainty, and that internal estimate is itself an overestimate.
So that when they say P(X) = C, their internal estimate is more like C(1-delta), while the long-run frequency with which X turns out true, given that they said P(X) = C, is more like C(1-delta)(1-gamma).
So when you state a probability, listeners downgrade what you say by (1-delta), whether or not you inflated it in the first place.
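To make that concrete, here’s a minimal sketch in Python of how the two discounts compound. The function names and the values delta = gamma = 0.1 are mine, chosen purely for illustration, not anything measured:

```python
# Sketch of the discounting model above, with made-up parameters:
# delta = how much speakers inflate stated confidence over their
#         internal estimate
# gamma = how overconfident that internal estimate is relative to
#         long-run outcomes

def internal_estimate(stated: float, delta: float) -> float:
    """Speaker's actual internal confidence, given what they said."""
    return stated * (1 - delta)

def long_run_frequency(stated: float, delta: float, gamma: float) -> float:
    """How often the claim turns out true, given the stated confidence."""
    return stated * (1 - delta) * (1 - gamma)

def listener_reading(stated: float, delta: float) -> float:
    """Listener's discounted interpretation of a stated probability."""
    return stated * (1 - delta)

if __name__ == "__main__":
    C, delta, gamma = 0.90, 0.10, 0.10  # illustrative values only
    print(f"stated confidence:   {C:.2f}")                                 # 0.90
    print(f"internal estimate:   {internal_estimate(C, delta):.2f}")       # 0.81
    print(f"long-run frequency:  {long_run_frequency(C, delta, gamma):.3f}")  # 0.729
    print(f"listener's reading:  {listener_reading(C, delta):.2f}")        # 0.81
```

On these (assumed) numbers, a carefully stated “90%” gets heard as roughly 81%, which is exactly the frustration in the parent comment.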
Kind of a Gresham’s law for probabilistic predictions: overconfident predictions drive out appropriately confident predictions.
Evolution is just a theory!