Which you can see being done by a LW regular in the LW post Thinking Bayesianically, with Lojban. So it’s not like this is something no one does, or something only idiots do.
I’m being slightly unfair. The actual figure being described in those terms is nearer to 76%.
I like the principle, but 5% is “extremely unlikely”? Something that happens on the way to work once every three weeks?
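A rough back-of-the-envelope check of that frequency claim (assuming one to two commute legs per workday, i.e. roughly 5 to 10 trips a week, numbers not given in the original):

$$
\frac{1}{0.05} = 20 \text{ trips} \;\approx\; \frac{20 \text{ trips}}{5\text{–}10 \text{ trips/week}} \;\approx\; 2\text{–}4 \text{ weeks}
$$

So "once every three weeks" is about right for a 5% per-trip event.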
It can be a bit scary, but in a lot of domains that’s exactly what people mean when they say “extremely unlikely”.
It’s extremely unlikely that humans aren’t responsible for global warming.
And it’s not even as scary as people saying “beyond a reasonable doubt” to mean something like ‘P > 75%’.