One morning, while walking to the train station and thinking about something I had read, my thoughts wandered to how all of this affects my faith. And I noticed myself flinching away, and thought, “Isn’t this what Eliezer calls ‘flinching away’?” I didn’t resolve my doubts there and then, but there was no turning back, and a couple of days later I was an atheist.
I recall a Gom Jabbar spell cast on a hapless teacher in a similar circumstance.
Jokes aside, some of what EY preaches here IS WRONG, since there is absolutely no way he is right about everything. If someone tells you otherwise, they are treating EY as a cult leader, not a teacher. So, ask yourself: what if the idea you just thought over and internalized is wrong? Because, chances are, at least one of them is. If there is a topic in the sequences you consider yourself an expert in, start there. It might be his approach to free will, or to quantum mechanics, or to fun theory, or to the dark arts, or…
Until you have proven EY wrong at least once on this forum, you are not ready for rationality.
(Hope this is not too dark for you.)
I have a not at all short list of things I think Eliezer is wrong on but this seems incorrect. I agree that there’s absolutely no way that Eliezer is right about everything. But everything in the Sequences is a (small) proper subset of everything Eliezer believes. So the notion that everything he has said here is correct isn’t as unreasonable. (That said, there are quite a few issues with things Eliezer has said here including things in the Sequences.)
This sounds disturbingly like the apprentice beating the master and then leaving. This sort of notion always annoys me. The first time the master can beat the apprentice is not the time when the apprentice has nothing left to learn from the master. It simply indicates that marginal returns are likely to start diminishing. For similar reasons we don’t give a PhD in math to someone as soon as they can prove something new that their adviser tried and failed to prove. They need to do a lot more than that.
The Sequences have been estimated at about 1 million words. I daresay the notion that everything is “correct” there is… unrealistic.
I can even corroborate that notion by pointing out that Eliezer is genetically human; and no human being is immune to the various cognitive biases and other failure modes of rationality; ergo even the best of us will, on occasion, be incorrect on a topic we have established expertise in. Even if we assume it happens less frequently in Eliezer than in any other expert on any other topic, I assign a vanishingly small probability to the notion that there are no errors in a body of work more than twice the length of The Lord of the Rings.
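As a rough back-of-the-envelope illustration (the claim count and per-claim accuracy here are assumptions made up for the example, not figures from anything Eliezer has written): if the Sequences contain on the order of a thousand distinct substantive claims, each independently correct with probability 0.99, then

P(\text{no errors}) = p^{N} = 0.99^{1000} \approx 4 \times 10^{-5}.

The exact number is sensitive to the assumed per-claim error rate, but for any realistic rate applied over that many claims, the chance of at least one error remains high.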
Especially since the Sequences were written as an exercise in “just getting a post out of the door” instead of spending a long time thinking about and revising each post.
This is a really evocative phrasing that helps the point a lot. I’m updating my position accordingly. There’s an extremely high probability that something of the length of the sequences has at least a few things wrong with it. The fact that this probability is slightly less than the probability of a mistake somewhere in the whole of someone’s beliefs shouldn’t matter much, because the underlying probability is still extremely high.
Mind posting it?
Summary of the major stuff I think Eliezer is wrong on:
Everything Eliezer wrote about blue tentacles is wrong.
Eliezer’s use of phlogiston as an example of a bad hypothesis shows a lack of historical knowledge about what was actually believed. Phlogiston was rejected, by and large, because it had been falsified, so claiming that it was unfalsifiable is incorrect. It is true that some people (especially Joseph Priestley) tried to add ad hoc hypotheses to prevent its falsification, but they were a tiny minority.
Eliezer drastically overestimates the difference between “traditional rationalism” and “extreme rationalism”.
Eliezer underestimates how many physicists take MWI seriously, yet at the same time ignores the fact that many people who have thought about the same issues, and who know a lot more about them than he does, have not accepted it.
Eliezer’s negative opinion of academia is by and large inaccurate and unjustified, and to some extent seems to stem from stereotypes that aren’t really accurate and from his own lack of experience with it.
Eliezer has a massive, deeply unjustified nostalgia for the science, and the attitudes toward science, of the 1960s or so.
Eliezer massively overestimates the chance of an intelligence explosion occurring, primarily because he doesn’t take into account how difficult software optimization is and how much theoretical computer science limits it, and he underestimates how much technical difficulty is involved in serious nanotech.
Can you expand on this? Which posts are you referring to?
Not posts, but comments he has made. All such comments are actually pretty recent, so they may reflect recent viewpoints. There seem to be hints of this in some older remarks, but 1 and 2 are recent extreme examples. Curiously, judging from the karma, a lot of the community disagreed with Eliezer on the first claim but a lot agreed with him on the second.
Ok, I give. Where does Eliezer talk about blue tentacles?
In Some Claims Are Just Too Extraordinary. I’m not sure if he still believes this, though, since it seems to contradict the spirit of How To Convince Me That 2 + 2 = 3.
Note that the original whole “blue tentacle” thing is from A Technical Explanation of Technical Explanation.
Thanks, I’d forgotten about that.
There’s nothing wrong with this discussion as such, but it is of no practical relevance. Regardless of whether or not errors were written into the sequences, errors are read out of them. I’ve been surprised and frustrated by people’s stupidity, such as the mass misunderstanding of the Allais paradox or dust specks, but I probably am misinterpreting something that would be obvious to someone smarter. This might even be on an issue where I am right and only wrongly think I disagree with others.
At least once Eliezer has posted a factual error and been corrected and said “oops”.
Sure. My argument wasn’t that there’s nothing wrong with the Sequences, but that it wasn’t completely unreasonable to think that there would be no mistakes in a work of that length. I already stated in my post that I thought there were problems with what Eliezer has said. But in any event, see my reply to Logos, where he convinced me that it is still extremely unlikely for something of this length to have no mistakes.
I think it really is rather unreasonable. Take a human, no matter how smart or rational, have them write one blog post per day for several years, much of it on debated topics, and I would be shocked if nothing they said turned out to be false. Even given that the beliefs in the Sequences are only a small fraction of EY’s beliefs in general, it’s still rather unlikely. Every one of his mistakes would have had to be about something other than what he posted about, which is a bit much for my inner skeptic.
It is more plausible that there are no errors in a smaller sample of your beliefs than in a larger one, but the probability of at least one error still increases with the size of the sample.
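A toy model makes the same point in symbols (assuming, purely for illustration, that each sampled belief is independently wrong with some fixed probability $\epsilon > 0$):

P(\text{at least one error among } n \text{ beliefs}) = 1 - (1 - \epsilon)^{n},

which is strictly increasing in $n$, so a larger sample of beliefs is always more likely to contain an error than a smaller one.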
I don’t know that you and JoshuaZ are really disagreeing with one another so much as you are taking alternate perspectives on the same set of data.
I’m having trouble parsing the phrase “ready for rationality.”
Cheesy pathos, I agree. (And an obscure reference to Babylon 5.)
(Note that whole math textbooks can be essentially correct. Minor errors can usually be corrected without affecting anything else.)
While this is true, most math [1] textbooks don’t provide verbose treatments of controversial, unresolved, possibly untestable meta-problems [2] (where the validity of the conclusions crucially depends on previous controversial, unresolved, possibly untestable meta-problems).
[1] String theory textbooks provide a possible anti-example.
[2] Metaphysics, metacognition, metaprogramming.
I can assure you that the maths in a string theory textbook will still be essentially correct.
No, it’s not too dark; it is useful to see an even stronger expression of caution. But it misses the point a bit. It’s not very helpful to know that Eliezer is probably wrong about some things, and neither is finding a mistake here or there. It just doesn’t help.
You see, my goal is to accept and learn fully that which is accurate, and reject (and maybe fix and improve) that which is wrong. Neither one is enough by itself.
How about accepting that some things are neither, but you still have to make a choice? (E.g., the inevitability of (u)FAI is untestable and relies on a number of disputed assumptions and extrapolations; the same goes for the viability of cryonics.) How do you construct your priors to make a decision you can live with, and how do you deal with the situation where, despite your best priors, you end up being proven wrong?
Now, this is a much better question! And yes, I am thinking a lot about these. But in some sense this kind of thing bothers me much less: because it is so clear that the issue is unclear, my mind doesn’t try to unconditionally commit it to the belief pool just because I read something exciting about it. And then I know I have to think about it, look for independent sources, etc. (For these two specific problems, I am in different states of confusion. Cryonics: quite confused; AGI: a bit better, at least I know what my next steps are.)
How do you deal with this?
Out of curiosity, what do you disagree with him on?
I commented on MWI once or twice or a dozen times here, a subject dear to my heart, with little interest from the regulars. There are some other topics I mentioned in passing, but not worth getting into here.
This seems to be a straw-man. Has anyone ever asserted the infallibility of everything Eliezer has posted? Not even the Pope has that going (he only has an infallible hat he can put on), and it seems to be contradicted many times in the posts themselves with notes of edits made. Everything Eliezer has posted being right is substantially less probable than the least likely thing he has posted.
But everything Eliezer has posted doesn’t have to be right for there to be much of value, or for you to rely with some confidence on a particular assertion being right (particularly after you have read the arguments behind it) - and some of what is written here is fairly uncontroversial.
A straw man is a component of an argument and is an informal fallacy based on misrepresentation of an opponent’s position, twisting his words, or making [false] assumptions. Please point out where I have misrepresented Klao’s position. If anything, you are misrepresenting mine, as I never made any of the claims you refuted.