Even without a coherent theory of planetary motion, we can assign a very low probability to the earth crashing into the sun simply on the basis that it hasn’t yet.
...that is exactly the sort of judgment which requires some sort of theory. Every day, trillions of things happen which have never happened before. Never in the history of the universe has this comment been posted to LW!
Well, a couple days ago, we could reasonably have assigned a pretty low probability to that exact post being made today.
New things happen all the time, but without a model, we can’t assign much likelihood to any specific new thing happening at any particular time.
Without some kind of model, you can’t assign any probabilities, period.
And yet it was made, regardless. People get hit with Black Swans all the time.
That’s not what a Black Swan is. A Black Swan is a high-impact event that falls outside what prevailing models anticipated, not just any specific occurrence that was a priori unlikely.
Huh. Looks like I’ve been misusing it all this time. Thanks!
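One way to make the “it hasn’t happened yet” intuition precise, as a sketch using Laplace’s rule of succession (which neither commenter names, and which is itself a minimal model rather than no model), is to treat each of the roughly 1.6 × 10^12 days the earth has completed without falling into the sun as a trial with zero occurrences so far:

\[
P(\text{crash on day } n{+}1 \mid \text{no crash in days } 1,\dots,n)
\;=\; \frac{0+1}{n+2}
\;\approx\; \frac{1}{1.6\times 10^{12}}
\;\approx\; 6\times 10^{-13}\ \text{per day}.
\]

Even this thin estimate smuggles in modeling choices, such as taking a day as the trial unit and treating the trials as exchangeable, which is roughly the objection being raised above.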
‘Unexpected things happen all the time’ isn’t necessarily a reason to be less surprised by specific especially unexpected things. The reason Eliezer’s post isn’t crazy super surprising isn’t that surprising things are (surprisingly?) common; it’s that it’s relatively ordinary (in-character, etc.) for its reference class.
Except surprising things are surprisingly common. Most people overestimate the likelihood that their model is correct.
But this doesn’t seem like a great example of that, yeah. I was sort of pattern-matching this into the wider discussion (is it worth figuring out if the earth will crash into the sun?).