The context matters here.
The original post by Omnizoid spent its first third ranting about their journey discovering that Yud was a liar and a fraud, very carefully worded to optimize for appeal to ordinary EAforum users, and didn’t back up any of those claims until the latter two thirds, which were mostly esoteric consciousness arguments and wrong decision theory. What ultimately happened was that 95% of readers only read the slander that made up the first third, and not any of the difficult-to-read arguments that Omnizoid repeatedly implied would back it up. People line up to do things like that.
That was what happened, and the impression I got from Yud’s response was that he wasn’t really sure whether to engage with it at all, since doing so might incentivize more people to adopt a similar strategy in the future. I was also confused by Yud calling it a “basic rule of epistemic conduct”, but if that seemed like a good way to mitigate the harm while it was ongoing, then that was his call.
From my perspective, when Yud’s reputation is bad, it’s basically impossible to convince important and valuable people to read The Sequences and HPMOR and inoculate themselves against a changing world of cognitive attacks, whereas it’s easy to convince them to read those texts when his reputation is good. So if he fumbled the task of mitigating the damage from Omnizoid’s bad-faith attacks, then yes, that’s too bad. But he was also under time constraints: it was an ongoing situation, the post was getting tons of attention on EAforum, and the harm increased with every passing hour, so a speedy response was required. If you want to judge how bad or careless the fumble was, you have to understand the full context of the situation that was actually taking place.
Certainly omnizoid’s post was bad—there’s hardly any disputing that. The ratio of snark to content was egregious, the snark itself was not justified by the number and type of the examples, and the actual examples were highly questionable at best. I think that most folks in this discussion basically agree on this.
(I, personally, also think that omnizoid’s general claim—that Eliezer is “frequently, confidently, egregiously wrong”—is false, regardless of how good or bad are any particular arguments for said claim. On this there may be less general agreement, I am not sure.)
The question before us right now concerns, specifically, whether “put the object-level refutation first, then make comments about the target’s character” is, or is not, a “basic rule of epistemic conduct”. This is a narrow point—an element of “local validity”. As such, the concerns you mention do not bear on it.
I dispute that . . .
I think that “the author of the post does not think the post he wrote was bad” is quite sufficiently covered by “hardly any”.
Yeah, I was just kidding!
I didn’t say Eliezer was a liar and a fraud. I said he was often overconfident and egregiously wrong, and explicitly described him as an interesting thinker who was worth reading.
This is an amazingly clarifying comment!! Thanks for writing it!
I want people to read the Sequences because they teach what kinds of thinking systematically lead to beliefs that reflect reality. I’m opposed to people making misleading claims about what kinds of thinking systematically lead to beliefs that reflect reality (in this case, positing a “basic rule of epistemic conduct” that isn’t one) just because it seems like a good way to mitigate ongoing harm to their reputation. That’s not what the Sequences say to do!
What do you mean by “bad faith” in this context? Following Wikipedia, I understand bad faith to mean “a sustained form of deception which consists of entertaining or pretending to entertain one set of feelings while acting as if influenced by another”—basically, when the stated reasons aren’t the real reasons. (I recently wrote a post about why I don’t find the term that useful, because I think it’s common for the stated reasons to not be the real reasons, rather than a rare deviation.)
I agree that Omnizoid’s post was bad. (At a minimum, the author failed to understand what problems timeless decision theory is trying to solve.) But I don’t want to call it “bad faith” without simultaneously positing that the author has some particular motive or hidden agenda that they’re not being forthcoming about.
What motive would that be, specifically? I think the introduction was pretty forthcoming about why Omnizoid wanted to damage Yudkowsky’s reputation (“Part of this is caused by personal irritation”, “But a lot of it is that Yudkowsky has the ear of many influential people”, “Eliezer’s influence is responsible for a narrow, insular way of speaking among effective altruists”, “Eliezer’s views have undermined widespread trust in experts”). I don’t see any particular reason to doubt that story.
I think that in this context “bad faith” means that he has a motive to damage Yudkowsky’s reputation aside from the one stated in the article (that he thinks Yudkowsky made a specific mistake at a specific point).
My goal was to get people to not defer to Eliezer. I explicitly say he’s an interesting thinker who is worth reading.
From my point of view, there’s nothing irreplaceable about HPMOR and The Sequences. Science’n’logic have been around for a long time.