Is there a way to summarize this shortly? Eliezer disagreed with you about something, or maybe you just interpreted something he wrote as a disagreement with you… and now your soul can’t find peace until he admits that he was wrong and you were right about things that are too meta for me to understand wtf you are talking about...
Here’s an attempt.
Sometimes people have expectations of each other, like “you won’t steal objects from my house”. Those expectations get formed by both explicit and implicit promises. Violating those expectations is often a big deal, not just to the injured party but also to third parties—someone who stole from Alice might well steal from you, too.
To the extent this community encouraged expectations of each other, they were about core epistemic virtues and discussion practices. People will try to ensure their beliefs are consistent with their other beliefs; they won’t say things without believing them; they’ll share evidence when they can; when they are bound to be uncooperative, they at least explain how and why they’ll be uncooperative, and so on.
[For example, I keep secrets because I think information can be owned, even though this is cooperative with the information-owner and not with the information-wanter.]
So “Eliezer disagreed with you about something” is an understatement; disagreement is fine, expected even! The thing was that instead of having a regular disagreement in the open, Zack saw Eliezer as breaking a lot of these core expectations, not being open about it or acknowledging it when being called out, and also others not reacting to Eliezer breaking those expectations. (If Eliezer had punched Zack, people would probably have thought that was shocking and criticized it, but this was arguably worse given the centrality of these expectations to Eliezer’s prominence and yet people were reacting less.)
That said, the promises were (I think) clearly aspirational / mediated by the pressures of having to actually exist in the world. I do think it makes sense to have a heresy budget, and I think Zack got unlucky with the obsession lottery. I think if people had originally said to Zack “look, we’re being greengrocers on your pet issue, sorry about throwing you to the wolves” he would have been sad but moved on; see his commentary on the 2013 disavowal.
Instead they made philosophical arguments that, as far as I can tell, were not correct, and this was crazy-making, because Zack now also doubted his reasoning that led to him disagreeing with them, but no one would talk about this publicly. (Normally if Zack was making a mistake, people could just point to the mistake, and then he could fix the upstream generator of that mistake and everyone could move on.) And, also, to the extent that they generalized their own incorrect justifications to reasoning about other fields, this was making them crazy, in a way that should have alarmed third parties who were depending on their reasoning. The disinterest of those third parties was itself also expectation-violating.
[I don’t think I was ever worried about this bleeding over into reasoning about other things; I probably would have joined the conversation more actively if I had? I do regret not asking people what their strategy was back in ~2019; the only people I remember talking to about this were Zack and the LW team.]
I think this is a pretty good summary.

I do want to… disagree? quibble? (I am not actually sure how to characterize this)… on one bit, though:
I do think it makes sense to have a heresy budget
I agree that it makes sense to have a heresy budget, but I think that it’s important to distinguish between heresies that directly affect you and/or other people in your own community, and heresies that you can “safely”[1] ignore.
For example, suppose that I disagree with the mainstream consensus on climate change. But I, personally, cannot do anything to affect government policy related to climate change, or otherwise alter how society treats the issue. Maybe our community as a whole can have some effect on such things… but probably not. And there’s nothing to be done about it on an individual basis. So if I, and the rest of the rationalist community, mostly avoid talking about the subject (and, if forced to discuss it, we mouth the necessary platitudes and quickly change the subject), then relatively little is lost.
Now suppose that the subject is something like… distortions in reporting, by municipal governments, of violent crime statistics. Getting the wrong answer on a question like that might expose you and your family to significant personal danger, so it’s important to get the right answer. On the other hand, there’s nothing special about rationalists that makes this a more important question for us than for anyone else. On the third hand, maybe we’re unusually well-positioned to get such questions right. Still, the question of whether this should be part of our “heresy budget” is not clear-cut.
(But see COVID for an example of a question in this latter category which we did choose to include in our “heresy budget”. Of course, it wasn’t a very severe heresy, and maybe that’s part of why we were able to take that stance toward it. In any case, it worked out fairly well for us, yes?)
Finally, suppose that the subject is something like homeschooling, or child education more generally. Not only is this question highly personal for anyone who has children, but it’s substantially more likely to be relevant to people in the rationalist community than for the general population, due to the prevalence in said community of a variety of (heritable!) personality traits. Getting questions in this domain wrong is unusually likely, for us, to result in inflicting substantial suffering on our children. Quite reasonably, therefore, this is solidly within our “heresy budget”.
It seems clear that trans issues fall into the third category…
… or they should, at least. But that’s not how the rest of the rationalist community sees it, as Zack has discovered. That is, at the least, somewhat odd.
(Note that it does not suffice to say “actually, the mainstream consensus on trans issues is correct, so there is nothing to be heretical about”—since the heresy seems to consist not only of reaching some dissenting conclusion, but also of treating various relevant questions as open in the first place!)

[1] In the “they have not, in fact, come for me (yet)” sense of “safely”, at least.
I mean, this is probably correct. But my problem is that despite finding a lot of Zack’s claims on this topic in the past quite reasonable, I find the discussion over the last year or two exhausting to engage with. This post is 20k+ words alone! I’m not reading that. And there’s no article I know of which gives a reasonably good summary of what the heck is going on. So I’m not observing what Zack’s saying, let alone deciding and acting. Right now, I’m struggling to orient.
By the way, thank you for writing this comment. Same goes for @tailcalled and @Vaniver’s comments to this post. If only @Zack_M_Davis would write posts this concise!
EDIT: Changed Zack_D to Zack_M_Davis because of Rafael Hearth’s correct response that Zack_D has not written any posts longer than 2k words.
To some degree, litigating deceptive behavior from Eliezer and Scott is just inherently going to be exhausting, because it’s in their interest to make the deception confusing.
To be fair, @Zack_D hasn’t written any posts longer than 2000 words!