But the claims in the Sequences are a (small) proper subset of everything Eliezer believes, so the notion that everything he has said here is correct isn’t as unreasonable as the notion that everything he believes is correct.
The Sequences have been estimated at about 1 million words. I daresay the notion that everything is “correct” there is… unrealistic.
I can even corroborate that notion by pointing out that Eliezer is genetically human, and no human being is immune to the various cognitive biases and other failure modes of rationality; ergo even the best of us will occasionally be wrong on a topic we have established expertise in. Even if we assume it happens less frequently in Eliezer than in any other expert in any other topic, the notion that there are no errors in a body of work more than twice the length of The Lord of the Rings is one to which I assign a vanishingly small probability.
The notion that there are no errors in a body of work more than twice the length of The Lord of the Rings is one to which I assign a vanishingly small probability.
This is a really evocative phrasing that helps the point a lot, and I’m updating my position accordingly. There’s an extremely high probability that something the length of the Sequences has at least a few things wrong with it. That this probability is less than the probability of a mistake in at least one of someone’s beliefs shouldn’t matter much, because the underlying probability is still extremely high.
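To put a rough number on “extremely high,” here’s a minimal back-of-the-envelope sketch. The figures are illustrative assumptions, not measurements of the Sequences (neither the claim count nor the per-claim error rate comes from the text): with N independent claims, each wrong with probability p, the chance of at least one error is 1 - (1 - p)^N.

```python
# Probability that at least one of n independent claims is wrong,
# given a per-claim error probability p: 1 - (1 - p)**n.
def p_at_least_one_error(n: int, p: float) -> float:
    return 1.0 - (1.0 - p) ** n

# Illustrative assumptions (not measured from the Sequences):
# ~1,000,000 words -> say ~10,000 distinct claims, and an unusually
# careful author who errs on only 1 claim in 1,000.
print(p_at_least_one_error(10_000, 0.001))  # ~0.99995
```

Even granting a generously low per-claim error rate, an error somewhere in a work that long is a near-certainty.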
Especially since the Sequences were written as an exercise in “just getting a post out the door” rather than spending a long time thinking about and revising each post.