Brent,
From what I understood from reading the Wikipedia article on Bayesian probability and inferring from how he writes (and correct me if I’m wrong), Eliezer is talking about your “subjective probability.” You are a conscious being who interprets input as information. Given a lot of this information, you’ve formed the idea that 7 is prime. You’ve also formed the ideas that other people exist and that the sky is blue, which likewise have a high subjective probability in your mind because you have a lot of direct information to sustain those beliefs.
Moreover, if you’ve ever been wrong before, hopefully you’ve noticed that you have been wrong before. That’s a little information that “you are sometimes wrong about things that you are very sure of.” If you fold that information into your calculation of the probability of the idea that “7 is prime,” you still end up with a high probability, but not 1.
Now, you might not think that “you are sometimes wrong about things that you are sure of” applies to every single subject, such as primeness. But what if you had the information that other smart human beings have, at some point in the past, incorrectly understood the primeness of a number (the anecdote)? You might state that “human beings are sometimes wrong about the primeness of a number” and “I am a human being.” Again, if you include that information in your calculation of the probability that “7 is prime” is true, you end up with a high probability, but not 1.
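The fallibility adjustment described above can be sketched numerically. This is my own toy model, not Eliezer’s formalism, and all the numbers in it are hypothetical: it just splits the probability by the law of total probability over whether your reasoning about primeness is sound.

```python
def fallibility_adjusted(p_sound, p_if_unsound=0.5):
    """Toy model (hypothetical numbers, not anyone's actual credences):

    P(claim) = P(claim | reasoning sound) * P(sound)
             + P(claim | reasoning unsound) * P(unsound)

    Here P(claim | sound) = 1: if my reasoning is sound, 7 is prime.
    p_if_unsound is a stand-in for how often the claim holds anyway
    when my reasoning has failed.
    """
    return 1.0 * p_sound + p_if_unsound * (1.0 - p_sound)

# Even granting yourself extremely reliable reasoning about primeness,
# the result is high but strictly below 1:
p = fallibility_adjusted(p_sound=0.9999)
print(p)  # high, but not 1
```

The point of the sketch is only that any nonzero allowance for your own error, however small, pulls the final probability off of 1.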
(Oh, but what if you didn’t make the statement “human beings are sometimes wrong about the primeness of a number,” but instead, “this idiot is sometimes wrong about the primeness of a number, but I never am”? Well, you can. That’s one big problem with Bayesian subjective probabilities: how do we generalize? How can we formalize things so that two people with the same information deterministically arrive at the same probability? Logical (or objective epistemic) probability attempts to answer these questions.)
So, you’re right that it was just “a single person” getting it wrong, that his certainty was incorrect. But that’s Eliezer’s point. We are not supreme beings lording over all reality; we are humans who have memorized some information from the past and made some generalizations, including the generalization that sometimes our generalizations are wrong.