I often add “I believe” to sentences to clarify that I am not certain.
“Did you feed the dog?”
“Yes”
and
“Did you feed the dog?”
“I believe so”
have different meanings to me. I parse the first as “I am highly confident that I fed the dog” and the second as “I am unable to remember for sure whether I fed the dog, but I am >50% confident I did so.”
It always seems to me that any little disclaimer about my degree of certainty disproportionately skews the way others interpret my statements.
For instance, if I’m 90% sure of something, and carefully state it in a way that illustrates my level of confidence (as distinct from 100%), people seem to react as if I’m substantially less than 90% confident. In other words, any acknowledgement of less-than-100%-confidence seems to be interpreted as not-very-confident-at-all.
I find a similar effect. It looks to me like most people systematically overstate probabilistic claims, announcing a confidence level above their own internal estimate.
So when they say P(?) = C, their internal estimate is really P(?) = C(1-delta), and since even that internal estimate tends to be overconfident, the long-run frequency with which such claims turn out true is more like E(?) = C(1-delta)(1-gamma).
Listeners learn this, so when you say P(?) = C, they discount what you say by a factor of (1-delta).
Kind of a Gresham’s law for probabilistic predictions: overconfident predictions drive out appropriately confident predictions.
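The discounting model above can be sketched in a few lines of Python. The delta and gamma values here are purely illustrative, chosen to show the mechanics, not taken from any data:

```python
def internal_estimate(stated_c, delta):
    """Speaker's actual belief when they state confidence C: C(1-delta)."""
    return stated_c * (1 - delta)

def long_run_frequency(stated_c, delta, gamma):
    """How often the claim turns out true, allowing for the speaker's
    own overconfidence gamma on top of the overstatement delta."""
    return stated_c * (1 - delta) * (1 - gamma)

# Hypothetical numbers: a speaker announces 90% confidence,
# overstates by delta = 5%, and is overconfident by gamma = 10%.
stated = 0.90
delta, gamma = 0.05, 0.10
print(internal_estimate(stated, delta))           # speaker really believes 0.855
print(long_run_frequency(stated, delta, gamma))   # claim is true about 77% of the time
```

The point of the sketch is the gap between the three numbers: a listener who has learned the typical delta will rationally hear "90%" as something closer to 85%, which is exactly the discounting the comment describes.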
Evolution is just a theory!