I read the first half and kind of zoned out. I wish the author had shown some examples of communities lacking such problems, to contrast EA against.
How do you expect journalism to work? The author is trying to contribute one specific story, in detail. Readers have other experiences to compare and draw from. If this was an academic piece, I might be more sympathetic.
I feel confused by this argument.
The core thesis of the post seems to rely on the level of abuse in this community being substantially higher than in other communities (the last sentence seems to make that pretty explicit). I think if you want to compellingly argue for your thesis you should provide the best evidence you have for that thesis. Journalism commonly being full of fallacious reasoning doesn’t mean that it’s good or forgivable for journalism to reason fallaciously.
I do think it's good for journalists to summarize and distill concrete data from time to time, but even then people clearly benefit if the data is presented in a relatively unbiased way that doesn't distort the underlying truth much or omit crucial pieces of information that the journalist very likely knew but that didn't fit their narrative. I think journalists failing to do that is condemnable, and the resulting articles are rarely worth reading.
I don’t think the core thesis is “the level of abuse in this community is substantially higher than in others”. Even if we (very generously) just assumed that the level of abuse in this community was lower than that in most places, these incidents would still be very important to bring up and address.
When an abuse of power arises, the organisation or community in which it arises has roughly two possible approaches: clamping down on it or covering it up. The purpose of the first is to solve the problem; the purpose of the second is to maintain the reputation of the organisation. (How many of those Catholic church child abuse stories were covered up because of worries about reputational damage to the church?) By focusing on the relative abuse level, it seems like you are seeing these stories (primarily) as an attack on the reputation of your tribe ("A blue abused someone? No he didn't, it's Green propaganda!"). Does it matter whether the number of children abused in the Catholic church was higher than the number abused outside it?
If that is the case, then there is nothing wrong with that emotional response. If you feel a sense of community with a group and you yourself have never experienced the problem, it can just feel like an attack on something you like. The journalist might even be badly motivated (e.g. they think an editorial line against EA will go down well). But I still think it's a fairly unhelpful response.
Of course, one could argue that the position "Obviously deal with these issues, but also they are very rare and our tribe is actually super good" is perfectly logically consistent. And it is. But the language is doing extra work: by putting "us good" next to the issue, it sounds like minimising or dismissing the issue. Put another way, claims of "goodness" could be made in one post and left out of the sex abuse discussion entirely. The two are not very linked.
Does it matter whether the number of children abused in the catholic church was higher than the number abused outside it?
Yes, it does matter here, since base rates matter in general.
Honestly, one of the criticisms I want to share as a post later on is that LW ignores base rates and focuses too much on the inside view over the outside view. But in this case it does matter, since the analogous claim would be that the church is uniquely bad at sexual assault; if it turned out that it wasn't uniquely bad, then we don't have to panic.
That's the importance of base rates: they give you a solid number to compare against. Things are usually not nearly as unprecedented or new as they seem to someone encountering them for the first time.
The base-rates post sounds like an interesting one, I look forward to it. But, unless I am very confused, the base rates are only ever going to help answer questions like: “is this group of people better than society in general by metric X” (You can bring a choice Hollywood producer and Prince out as part of the control group). My point was that I think a more useful question might be something like “Why was the response to this specific incident inadequate?”.
That might be the problem here, since there seem to be two different conversations going on, judging by the article:
Why was this specific incident not responded to adequately?
Is our group meaningfully worse or better, compared to normal society? And why is it worse or better?
I can see how the article might be frustrating for people who know the additional context that the article leaves out (where some of the additional context is simply having been in this community for a long time and having more insight into how it deals with abuse). From the outside though, it does feel like some factors would make abuse more likely in this community: how salient “status” feels, mixing of social and professional lives, gender ratios, conflicts of interests everywhere due to the community being small, sex positivity and acceptance of weirdness and edginess (which I think are great overall!). There are also factors pushing in the other direction of course.
I say this because it seems very reasonable for someone who is new to the community to read the article and the tone in the responses here and feel uncomfortable interacting with the community in the future. A couple of women in the past have mentioned to me that they haven’t engaged much with the in-person rationalist community because they expect the culture to be overly tolerant of bad behaviour, which seems sad because I expect them to enjoy hanging out in the community.
I can see the reasons behind not wanting to give the article more attention if it seems like a very inaccurate portrayal of things. But it does feel like that makes this community feel more unwelcoming to some newer people (especially women) who would otherwise like to be here and who don’t have the information about how the things mentioned in the article were responded to in the past.
Yeah, I might want to write a post that tries to actually outline the history of abuse that I am aware of, without doing weird rhetorical tricks or omitting information. I’ve recently been on a bit of a “let’s just put everything out there in public” spree, and I would definitely much prefer for new people to be able to get an accurate sense of the risk of abuse and harm, which, to be clear, is definitely not zero and feels substantial enough that people should care about it.
I do think the primary reason why people haven’t written up stuff in the past is exactly because they are worried their statements will get ripped out of context and used as ammunition in hit pieces like this, so I actually think articles like this make the problem worse, not better, though I am not confident of this, and the chain of indirect effects is reasonably long here.
I would be appreciative if you do end up writing such a post.
Sad that sometimes the things that seem good for creating a better, more honest, more accountable community for the people in it also give outsiders ammunition. My intuitions point strongly in the direction of doing things in this category anyway.
I don’t disagree with the main thrust of your comment, but,
I just wanna point out that ‘fallacious’ is often a midwit objection, and either ‘fallacious’ is not the true problem or it is the true problem but the stereotypes about what is fallacious do not align with reality: A Unifying Theory in Defense of Logical Fallacies
Yeah, that’s fair. I was mostly using it as a synonym for “badly reasoned and inaccurate” here. Agree that there are traps around policing speech by trying to apply rhetorical fallacies, which I wasn’t trying to do here.
Mainstream academia?
A bit of searching brings me to https://elephantinthelab.org/sexual-harassment-in-academia/ :
“Yes. Research on sexual harassment in the academy suggests that it remains a prevalent problem. In a 2003 study examining incidences of sexual harassment in the workplace across private, public, academic, and military industries, Ilies et al. (2003) found academia to have the second highest rates of harassment, second only to the military. More recently, a report by the National Academies of Sciences, Engineering, and Medicine (NASEM) summarized the persistent problem of sexual harassment in academia with regard to faculty-student harassment, as well as faculty-faculty harassment. To find more evidence of this issue, one can also turn to Twitter, as Times Higher Education highlighted in their 2019 blog.”
Another paper suggests:
“In 2019, the Association of American Universities surveyed 33 prominent research universities and found 13% of all students experienced a form of sexual assault and 41.8% experienced sexual harassment (Cantor et al., 2020).”
Mainstream academia is not free from sexual abuse.
Whataboutism is a fallacy.
It is. But if someone is saying “this group of people is notably bad” then it’s worth asking whether they’re actually worse than other broadly similar groups of people or not.
I think the article, at least to judge from the parts of it posted here, is arguing that rationalists and/or EAs are unusually bad. See e.g. the final paragraph about paperclip-maximizers.
I fail to see why it matters what other broadly similar groups of people do. Rationalists ought to predict and steer the future better than other kinds of people, and so should be held to a higher standard. Deflecting with “but all the other kids are equally abusive!” is just really stupid.
As for the article, I’m not concerned with the opinion of a journalist either; they can be confused or bombastic about the exact extent of the problem if they want, it’s rather standard for journalists; but I don’t doubt that the problem is real and hasn’t been preemptively fixed before it happened, which bothers me because the founders of this community are more than smart enough to have at least made an attempt to do so.
Whether it matters what other broadly similar groups do depends on what you’re concerned with and why.
If you’re, say, a staff member at an EA organization, then presumably you are trying to do the best you could plausibly do, and in that case the only significance of those other groups would be that if you have some idea how hard they are trying to do the best they can, it may give you some idea of what you can realistically hope to achieve. (“Group X has such-and-such a rate of sexual misconduct incidents, but I know they aren’t really trying hard; we’ve got to do much better than that.” “Group Y has such-and-such a rate of sexual misconduct incidents, and I know that the people in charge are making heroic efforts; we probably can’t do better.”)
So for people in that situation, I think your point of view is just right. But:
If you’re someone wondering whether you should avoid associating with rationalists or EAs for fear of being sexually harassed or assaulted, then you probably have some idea of how reluctant you are to associate with other groups (academics, Silicon Valley software engineers, …) for similar reasons. If it turns out that rationalists or EAs are pretty much like those, then you should be about as scared of rationalists as you are of them, regardless of whether rationalists should or could have done better.
If you’re a Less Wrong reader wondering whether these are Awful People that you’ve been associating with and you should be questioning your judgement in thinking otherwise, then again you probably have some idea of how Awful some other similar groups are. If it turns out that rationalists are pretty much like academics or software engineers, then you should feel about as bad for failing to shun them as you would for failing to shun academics or software engineers.
If you’re a random person reading a Bloomberg News article, and wondering whether you should start thinking of “rationalist” and “effective altruist” as warning signs in the same way as you might think of some other terms that I won’t specify for fear of irrelevant controversy, then once again you should be calibrating your outrage against how you feel about other groups.
For the avoidance of doubt, I should say that I don’t know how the rate of sexual misconduct among rationalists / EAs / Silicon Valley rationalists in particular / … compares with the rate in other groups, nor do I have a very good idea of how high it is in other similar groups. It could be that the rate among rationalists is exceptionally high (as the Bloomberg News article is clearly trying to make us think). It could be that it’s comparable to the rate among, say, Silicon Valley software engineers and that that rate is horrifyingly high (as plenty of other news articles would have us think). It could be that actually rationalists aren’t much different from any other group with a lot of geeky men in it, and that groups with a lot of geeky men in them are much less bad than journalists would have us believe. That last one is the way my prejudices lean … but they would, wouldn’t they?, so I wouldn’t put much weight on them.
[EDITED to add:] Oh, another specific situation one could be in that’s relevant here: If you are contemplating Reasons Why Rationalists Are So Bad (cf. the final paragraph quoted in the OP here, which offers an explanation for that), it is highly relevant whether rationalists are in fact unusually bad. If rationalists or EAs are just like whatever population they’re mostly drawn from, then it doesn’t make sense to look for explanations of their badness in rationalist/EA-specific causes like alleged tunnel vision about AI.
[EDITED again to add:] To whatever extent the EA community and/or the rationalist community claims to be better than others, of course it is fair to hold them to a higher standard, and take any failure to meet it as evidence against that claim. (Suppose it turns out that the rate of child sex abuse among Roman Catholic clergy is exactly the same as that in some reasonably chosen comparison group. Then you probably shouldn’t see Roman Catholic Clergy as super-bad, but you should take that as evidence against any claim that the Roman Catholic Church is the earthly manifestation of a divine being who is the source of all goodness and moral value, or that its clergy are particularly good people to look to for moral advice.) How far either EAs or rationalists can reasonably be held to be making such a claim seems like a complicated question.
I am a pessimist who works from the assumption that humans are globally a bit terrible. Thus, I don’t consider the isolated data point of “humans in group x have been caught being terrible” to be particularly novel or useful.
Reporting that I would find useful would ultimately take the form “humans in group x trend toward differently terrible from humans in other groups”, whether that’s claiming that they’re worse, differently bad, or better.
Whenever someone claims that a given group is better than most of society, the obvious next question is “better at being excellent to each other, or better at covering it up when they aren’t?”.
The isolated data point of “people in power are accused of using that power to harm others” is like… yes, and? That’s kind of baseline for our species.
And as a potential victim, reporting on misconduct is only useful to me if it updates the way I take precautions against it, by pointing out that the misconduct in a given community is notably different from that in the world at large.
No, it’s not, especially given that ‘whataboutism’ is a label used to dismiss comparisons that don’t advance particular arguments.
Writing the words “what about” does not invalidate any and all comparisons.