Jokes aside, some of what EY preaches here IS WRONG, since there is absolutely no way he is right about everything. If someone tells you otherwise, they are treating EY as a cult leader, not a teacher.
Until you have proven EY wrong at least once on this forum, you are not ready for rationality.
I have a not-at-all-short list of things I think Eliezer is wrong about, but this seems incorrect. I agree that there’s absolutely no way that Eliezer is right about everything. But everything in the Sequences is a (small) proper subset of everything Eliezer believes. So the notion that everything he has said here is correct isn’t as unreasonable. (That said, there are quite a few issues with things Eliezer has said here, including things in the Sequences.)
This sounds disturbingly like the apprentice beating the master and then leaving. This sort of notion always annoys me. The first time the master can beat the apprentice is not the time when the apprentice has nothing left to learn from the master. It simply indicates that marginal returns are likely to start diminishing. For similar reasons we don’t give a PhD in math to someone as soon as they can prove something new that their adviser tried and failed to prove. They need to do a lot more than that.
The Sequences have been estimated at about 1 million words. I daresay the notion that everything is “correct” there is… unrealistic.
I can even corroborate that notion by pointing out that Eliezer is genetically human, and no human being is immune to the various cognitive biases and other failure modes of rationality; ergo even the best of us will, on occasion, be incorrect about a topic in which we have established expertise. Even if we assume that happens less frequently in Eliezer than in any other expert on any other topic, I assign a vanishingly small probability to the notion that there are no errors in a body of work more than twice the length of The Lord of the Rings.
Especially since the Sequences were written as an exercise in “just getting a post out of the door” instead of spending a long time thinking about and revising each post.
This is a really evocative phrasing that helps the point a lot. I’m updating my position accordingly. There’s an extremely high probability that something the length of the Sequences has at least a few things wrong with it. That this probability is less than the probability that there’s a mistake in at least one of someone’s beliefs shouldn’t be that relevant, because the underlying probability is still extremely high.
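To put rough numbers on that intuition (a back-of-the-envelope sketch with made-up figures, not anything from the thread): if each post independently has even a small chance of containing an error, the chance that a body of roughly a thousand posts is error-free collapses toward zero.

```python
# Back-of-the-envelope sketch; the per-post error rate and post count
# are illustrative assumptions, not measured figures.

def p_at_least_one_error(p_per_post: float, n_posts: int) -> float:
    """Probability of at least one error across n_posts, assuming each
    post independently has probability p_per_post of containing one."""
    return 1 - (1 - p_per_post) ** n_posts

print(p_at_least_one_error(0.01, 1000))   # ~0.99996
print(p_at_least_one_error(0.001, 1000))  # ~0.63
```

Even at a one-in-a-thousand per-post error rate, the odds are still better than even that something in there is wrong; the independence assumption is crude, but the conclusion isn’t sensitive to it.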
Mind posting it?
Summary of the major stuff I think Eliezer is wrong on:
Everything Eliezer wrote about blue tentacles is wrong.
Eliezer’s use of phlogiston as an example of a bad hypothesis shows a lack of historical knowledge about what was actually believed. Phlogiston was rejected by and large because it had been falsified, so claiming that it was unfalsifiable is incorrect. It is true that some people (most notably Joseph Priestley) tried to add additional ad hoc hypotheses to prevent its falsification, but they were a tiny minority.
Eliezer drastically overestimates the difference between “traditional rationalism” and “extreme rationalism”.
Eliezer underestimates how many physicists take MWI seriously, yet at the same time he ignores the fact that many people who have thought about the same issues, and who know far more about them than he does, have not accepted it.
Eliezer’s negative opinion of academia is by and large inaccurate and unjustified, and it seems to stem in part from stereotypes that aren’t really accurate and from his own lack of experience with it.
Eliezer has a massive and deeply unjustified nostalgia for the science, and the attitudes toward science, of the 1960s or so.
Eliezer massively overestimates the chance of an intelligence explosion occurring, primarily because he doesn’t take into account how difficult software optimization is and how much theoretical computer science constrains it, and he underestimates how much technical difficulty is involved in serious nanotech.
Can you expand on this? Which posts are you referring to?
Not posts, but comments he has made. All such comments are actually pretty recent, so they may reflect recent viewpoints. There seem to be hints of this in some older remarks, but 1 and 2 are recent extreme examples. Curiously, judging from the karma, a lot of the community disagreed with Eliezer on the first claim but a lot agreed with him on the second.
Ok, I give. Where does Eliezer talk about blue tentacles?
In Some Claims Are Just Too Extraordinary. I’m not sure if he still believes this, though, since it seems to contradict the spirit of How To Convince Me That 2 + 2 = 3.
Note that the original whole “blue tentacle” thing is from A Technical Explanation of Technical Explanation.
Thanks, I’d forgotten about that.
There’s nothing wrong with this discussion as such, but it is of no practical relevance. Regardless of whether or not errors were written into the sequences, errors are read out of them. I’ve been surprised and frustrated by people’s stupidity, such as the mass misunderstanding of the Allais paradox or dust specks, but I probably am misinterpreting something that would be obvious to someone smarter. This might even be on an issue where I am right and only wrongly think I disagree with others.
At least once Eliezer has posted a factual error and been corrected and said “oops”.
Sure. My argument wasn’t that there’s nothing wrong with the Sequences, but that it wasn’t completely unreasonable to think that there would be no mistakes in a work of that length. I already stated in my post that I thought there were problems with what Eliezer has said. But in any event, see my reply to Logos, where he convinced me that it is still extremely unlikely for something of this length to have no mistakes.
I think it really is rather unreasonable. Take a human, no matter how smart or rational, have them write one blog post per day for several years, much of it on debated topics, and I would be shocked if nothing they said turned out to be false. Even granting that the Sequences are only a small fraction of everything EY believes, it’s still rather unlikely. Every one of his mistakes would have had to be about something other than what he posted on, which is a bit much for my inner skeptic.
It is more plausible that you have no errors in a smaller sample of your beliefs than in a larger one, but the probability of there being at least one error still increases with the number of beliefs being sampled.
I don’t know that you and JoshuaZ are really disagreeing with one another so much as you are taking alternate perspectives on the same set of data.