I appreciate the object-level responses this post made and think it’s good to poke at various things Eliezer has said (and also think Eliezer is wrong about a bunch of stuff, including the animal consciousness example in the post). In contrast, I find the repeated assertions of “gross overconfidence” and associated snarkiness annoying, and in many parts of the post the majority of the text seems to be dedicated to repeated statements of outrage with relatively little substance (Eliezer also does this sometimes, and I also find it somewhat annoying in his case, though I haven’t seen any case where he does it this much).
I spent quite a lot of time thinking about all three of these questions, and I currently think the arguments this post makes seem to misunderstand Eliezer’s arguments for the first two, and also get the wrong conclusions on both of them.
For the third one, I disagree with Eliezer, but also, it’s a random thing that Eliezer has said once on Facebook and Twitter, that he hasn’t argued for. Maybe he has good arguments for it, I don’t know. He never claimed anyone else should be convinced by the things he has written up, and I personally don’t understand consciousness or human values well enough to have much of a confident take here. My current best guess is that Eliezer is wrong here, and I would be interested in seeing him write up his takes, but most of the relevant section seems to boil down to repeatedly asserting that Eliezer has made no arguments for his position, when like, yeah, that’s fine, I don’t see that as a problem. I form most of my beliefs without making my arguments legible to random people on the internet.
Yeah, I can see how that could be annoying. In my defense, however, I am seriously irritated by this, and I think there’s nothing wrong with being a bit snarky sometimes. Eliezer seemed to think in this Facebook exchange that his view just falls naturally out of understanding consciousness. But that is a very specific and implausible model.
I would be interested in your actual defense of the first two sections. It seems the OP went to great lengths to explain exactly where Eliezer went wrong, and contrasted Eliezer’s beliefs with citations to actual, respected domain experts.
I also do not understand your objection to the term “gross overconfidence”. I think the evidence provided by the OP is completely sufficient to substantiate this claim. In all three cases (and many more I can think of that are not mentioned here), Eliezer has stated things that are probably incorrect, and then dismissively attacked, in an incredibly uncharitable manner, people who believe the opposite claims. “Eliezer is often grossly overconfident” is, in my opinion, a true claim that has been supported with evidence. I do not think charitability requires one to self-censor such a statement.
For the first one, I found Eliezer’s own response reasonably comprehensive.
For the second one, I feel like this topic has been very extensively discussed on the site, and I don’t really want to reiterate all of that discussion. See the FDT tag.
Eliezer’s response is not comprehensive. He responds to two points (a reasonable choice), but he responds badly: first with a strawman, and second with an argument that is probably wrong.
The first point he argues is about brain efficiency, and is not even a point made by the OP. The OP was simply citing someone else, to show that “Eliezer is overconfident about my area of expertise” is an extremely common opinion. It feels very weird to attack the OP over citing somebody else’s opinion.
Regardless, Eliezer handles this badly anyway. He gives a one-paragraph explanation of why brain efficiency is not close to the Landauer limit. Except that if we look at the actual claim that is quoted, Jacob is not saying that the brain is at the limit, only that it’s not six orders of magnitude away from the limit, which was Eliezer’s original claim. So essentially he debunks a strawman position and declares victory. (I do not put any trust in Eliezer’s opinions on neuroscience.)
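For context, and not to adjudicate the brain-efficiency dispute itself, here is the quantity being argued over, evaluated at roughly body temperature. This is just the textbook Landauer bound plus a back-of-the-envelope reading of what “six orders of magnitude” would mean, not a claim about which side is right:

$$E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K}) \times (310\ \mathrm{K}) \times 0.693 \approx 3 \times 10^{-21}\ \mathrm{J\ per\ bit\ erased}$$

Six orders of magnitude above that would be on the order of $3 \times 10^{-15}$ J per bit-erasure-equivalent operation, so the disagreement is over whether the brain’s per-operation energy cost sits roughly at that level or much closer to the bound.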
When it comes to the zombies, I’ll admit to finding his argument fairly hard to follow. The accusation levelled against him, both by the OP and Chalmers, is that he falsely equates debunking epiphenomenalism with debunking the zombie argument as a whole.
Eliezer unambiguously does equate the two things, as proven by the following quote highlighted by the OP:
It seems to me that there is a direct, two-way logical entailment between “consciousness is epiphenomenal” and “zombies are logically possible”
The following sentence, from his comment, seems (to me) to contradict his earlier claim.
It’s not that I think philosophers openly claim that p-zombies demonstrate epiphenomenalism
The most likely explanation, to me, is that Eliezer made a mistake, the OP and Chalmers pointed it out, and then he tried to pretend it didn’t happen. I’m not certain this is what happened (as the zombies stuff is highly confusing), but it’s entirely in line with Eliezer’s behavior over the years.
I think Eliezer has a habit of barging into other people’s domains, making mistakes, and then refusing to be corrected by people who actually know what they are talking about, acting rude and uncharitable in the process.
Imagine someone came up to you on the street and claimed to know better than the experts in quantum physics, and nanoscience, and AI research, and ethics, and philosophy of mind, and decision theory, and economic theory, and nutrition, and animal consciousness, and statistics and philosophy of science, and epistemology and virology and cryonics.
What odds would you place on such a person being overconfident about their own abilities?