This is an amazingly clarifying comment!! Thanks for writing it!
I also was confused by Yud calling it a “basic rule of epistemic conduct”, but if that seemed like a good way to mitigate the harm while it was ongoing, then that was his call. [...] when Yud’s reputation is bad, it’s basically impossible to convince important and valuable people to read The Sequences and HPMOR and inoculate themselves against a changing world of cognitive attacks, whereas it’s a downhill battle to convince them to read those texts when Yud’s reputation is good
I want people to read the Sequences because they teach what kinds of thinking systematically lead to beliefs that reflect reality. I’m opposed to people making misleading claims about what kinds of thinking systematically lead to beliefs that reflect reality (in this case, positing a “basic rule of epistemic conduct” that isn’t one) just because it seems like a good way to mitigate ongoing harm to their reputation. That’s not what the Sequences say to do!
I agree that Omnizoid’s post was bad. (At a minimum, the author failed to understand what problems timeless decision theory is trying to solve.) But I don’t want to call it “bad faith” without simultaneously positing that the author has some particular motive or hidden agenda that they’re not being forthcoming about.
What motive would that be, specifically? I think the introduction was pretty forthcoming about why Omnizoid wanted to damage Yudkowsky’s reputation (“Part of this is caused by personal irritation”, “But a lot of it is that Yudkowsky has the ear of many influential people”, “Eliezer’s influence is responsible for a narrow, insular way of speaking among effective altruists”, “Eliezer’s views have undermined widespread trust in experts”). I don’t see any particular reason to doubt that story.
I think that in this context “bad faith” means that he has a motive to damage Yudkowsky’s reputation aside from the one stated in the article (that he thinks Yudkowsky made a specific mistake at a specific point).
What do you mean by “bad faith” in this context? Following Wikipedia, I understand bad faith to mean “a sustained form of deception which consists of entertaining or pretending to entertain one set of feelings while acting as if influenced by another”—basically, when the stated reasons aren’t the real reasons. (I recently wrote a post about why I don’t find the term that useful, because I think it’s common for the stated reasons to not be the real reasons, rather than a rare deviation.)
My goal was to get people not to defer to Eliezer. I explicitly say he’s an interesting thinker who is worth reading.