Love the logic and the scale, although I think Vladimir_M pokes some important holes specifically at the 10^(-2) to 10^(-3) level.
May I suggest “un-planned-for errors”? In my experience, it is not useful to plan for contingencies with about a 1⁄300 chance of happening per trial. For example, on any given day of the year, my favorite cafe might be closed due to the owner’s illness, but I do not call the cafe first to confirm that it is open each time I go there. At any given time, one of my 300-ish acquaintances is probably nursing a grudge against me, but I do not bother to open each conversation with “Hi, do you still like me today?” When, as inevitably happens, I run into a closed cafe or a hostile friend, I usually stop short for a bit; my planning mechanism reports a bug; there is no ‘action string’ cached for that situation, for the simple reason that I was not expecting the situation, because I did not plan for it, because that is how rare it is. Nevertheless, I am not ‘surprised’—I know at some level that things that happen about 1⁄300 of the time are prone to happening once in a while. On the other hand, I would be ‘surprised’ if my favorite cafe had burned to the ground or if my erstwhile buddy had taken a permanent vow of silence. I expect that these things will never happen to me, and so if they do happen I go and double-check my calculations and assumptions, because it seems about equally likely that I am wrong about my assumptions and that the 1⁄30,000 event actually occurred. Anyway, the point is that a category 3 event is an event that makes you shut up for a moment but doesn’t make you reexamine any core beliefs.
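The double-checking intuition above can be made concrete with a toy Bayesian calculation. All numbers here are hypothetical illustrations, not claims from the original comment: suppose my model assigns the event a 1⁄30,000 chance, and there is also roughly a 1⁄30,000 chance that my assumptions are wrong in a way that would make the event unsurprising.

```python
# Toy Bayes check (hypothetical numbers): if P(event | model ok) and
# P(model wrong) are both ~1/30,000, then observing the event should
# split my posterior roughly 50/50 between "rare event happened" and
# "my assumptions were wrong."

p_event_given_model_ok = 1 / 30_000   # the model's own prediction
p_model_wrong = 1 / 30_000            # chance my assumptions are off
p_event_given_model_wrong = 1.0       # worst case: a wrong model makes the event certain

# Bayes' rule: P(model wrong | event observed)
numer = p_model_wrong * p_event_given_model_wrong
denom = numer + (1 - p_model_wrong) * p_event_given_model_ok
posterior_model_wrong = numer / denom

print(round(posterior_model_wrong, 3))  # ~0.5: genuinely a coin flip
```

So at the 1⁄30,000 level the sensible move really is to re-derive your assumptions rather than just shrug, since "I was wrong" and "it happened" carry comparable weight.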
If you hold most of your core beliefs with probability > .993, then you are almost certainly overconfident in your core beliefs. I’m not talking about stuff like “my senses offer moderately reliable evidence” or “F(g) = GMm/(r^2)”; I’m talking about stuff like “Solomonoff induction predicts that hyperintelligent AIs will employ a timeless decision theory.”