LessWrong is about learning rationality, and applying rationality to interesting problems.
An issue is that solving interesting problems often requires fairly deep technical knowledge of a field. To use rationality to help solve problems (especially as a group), you need people with rationality skills (probability, metacognition, and the like) as well as people with the skills directly applicable to whatever problem is under discussion.
But if you show up on LW and post something technical (or even just “specialized”) in a field that isn’t already well represented on the forum, it’ll be hard to have meaningful conversations about it.
Elsewhere on the internet there are probably forums focused on whatever-your-specialization is, but those places won’t necessarily have people who know how to integrate evidence and think probabilistically in confusing domains.
So far the LW userbase has a cluster of skills related to AI alignment, some cognitive science, decision theory, etc. If a technical post isn’t in one of those fields, you’ll probably get better reception if it’s somehow “generalist technical” (i.e. in some field that’s relevant to a bunch of other fields), or if it somehow starts one inferential unit away from the overall LW userbase.
A plausibly good strategy is to try to recruit a number of people from a given field at once, to try to increase the surface area of “serious” conversations that can happen here.
It might make most sense to recruit from fields that are close enough to the existing vaguely-defined-LW memeplex that they can also get value from existing conversations here.
Anyone have ideas on where to do outreach in this vein? (Separately, perhaps: how to do outreach in this vein?). Or, alternately, anyone have a vague-feeling-of-doom about this entire approach and have alternate suggestions or reasons not to try?
As I’ve been talking about on my shortform, I’d be excited about attracting more “programmer’s programmers”. AFAICT, a lot of LW users are programmers, but a large fraction of them are either more interested in transitioning into theoretical alignment research or just don’t really post about programming. As a small piece of evidence for this claim, I’ve been consistently surprised by the relatively lukewarm reaction to Martin Sustrik’s posts on LW. I read Sustrik’s blog before he started posting here and consistently find his posts there and here pretty interesting (I am admittedly a bit biased because I was already impressed by Sustrik’s work on ZeroMQ).
I think that’s a bit of a shame because I personally have found LW-style thinking useful for programming. My debugging process has especially benefited from applying some combination of informal probabilistic reasoning and “making beliefs pay rent”, which enabled me to make more principled decisions about which hypotheses to falsify first when finding root causes. For a longer example, see this blog post about reproducing a deep RL paper, which discusses how noticing confusion helped the author make progress (CFAR is specifically mentioned). LW-style thinking has also helped me stop obsessing over much of the debate around some of the more mindkiller-y topics in programming, like “should you always write tests first” and “are type-safe languages always better than dynamic ones”. In my ideal world, LW-style thinking applied to fuzzier questions about programming would help us move past these “wrong questions”.
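To make the debugging point concrete, here’s a minimal sketch of the kind of explicit reasoning I mean (the hypotheses and numbers are purely invented for illustration, not from any real debugging session): rank candidate root causes by prior plausibility and by how cheap they are to falsify, then test in that order.

```python
# A toy sketch of "probabilistic debugging": rank candidate root causes
# by prior plausibility and by how cheap they are to falsify, then test
# whichever hypothesis resolves the most probability mass per minute.
# All hypotheses and numbers here are invented for illustration.

hypotheses = [
    # (description, prior probability, minutes to test)
    ("stale cache entry",         0.40, 2),
    ("off-by-one in pagination",  0.30, 10),
    ("race condition under load", 0.20, 60),
    ("compiler/runtime bug",      0.01, 120),
]

def priority(h):
    _, prior, cost = h
    return prior / cost  # crude "probability mass resolved per minute"

for desc, prior, cost in sorted(hypotheses, key=priority, reverse=True):
    print(f"{desc}: prior={prior:.2f}, cost={cost} min, priority={prior / cost:.3f}")
```

Even this crude version captures why you check the cheap, plausible causes before the exotic ones, and why “it’s a compiler bug” should almost never be tested first.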
Programming already has a few other internet locuses such as Hacker News and lobste.rs, but I think those places have fewer “people who know how to integrate evidence and think probabilistically in confusing domains.”
Assuming this seems appealing, one way to approach getting more people of the type I’m talking about would be to reach out to prominent bloggers who seem like they’re already somewhat sympathetic to the LW meme-plex and see if they’d be willing to cross-post their content. Examples of the sort of people I’m thinking about include:
Hillel Wayne, who writes about empiricism in software engineering and formal methods.
Jimmy Koppel, who writes about insights for programming gleaned from his “day job” as a programming-tools researcher (I think he has a LW account already).
Julia Evans, who writes about programming practice and questions she’s interested in. A blog post of hers that seems especially LW-friendly is What does debugging a program look like?
Last, I do want to add a caveat to all this which I think applies to reaching out to basically any group: there’s a big risk of culture clash/dilution if the outreach effort succeeds (see Geeks, MOPs, and sociopaths for one exploration of this topic). How to mitigate this is probably a separate question, but I did want to call it out in case it seems like I’m just recommending blindly trying to get more users.
Jimrandomh recently had the interesting observation that there might have been legitimately fewer rationalists in the world prior to the invention of programming, because it actually forces you to notice when your model is broken, form new hypotheses, and test them, all with short feedback loops.
Yeah, I have a ton of confirmation bias pushing me to agree with this (because for me the two are definitely related), but I’ll add that I also think spending a lot of time programming helped me make reductionism “a part of me” in a way it wasn’t before. There are just very few other activities where you’re forced to express what you want, or a concept, to something that can fundamentally only understand a limited logical vocabulary. Math is similar, but I think programming makes the reductionist element more salient because of the compiler and because programming tends to involve more mundane work.
Yeah, in our office-discussion at the time I think the claim was something like “Prior to programming, Math Proofs were the best way to Get The Thing, and they were slower and the feedback less clear.”
(My sense is that programming _hasn’t_ deeply given me the thing, until perhaps recently when I started getting more intentional about deliberate debugging practice. But it definitely makes sense that programming would at least open up the possibility of gaining the skill. The main remaining question in my mind is “how much does the skill transfer, by default, if you’re not deliberately trying to transfer it?”)
As someone who landed on your comment specifically by searching for what LW has said about software engineering in particular, I’d love to read more about your methods, experiences, and thoughts on the subject. Have you written about this anywhere?
Sadly, not much. I wrote one blog post a few years back about my take on why “reading code” isn’t a thing people should do in the same way they read literature, but not much (publicly) other than that. I’ll think about whether there’s anything relevant to stuff I’ve been doing recently that I could write up.
I would recommend the other writers I linked, though! They are much more insightful than I am, anyway!
(minor note: the Jimmy and Julia links didn’t work properly, because external links need to be prefaced with “https://www.”)
This link is broken. It goes to: https://www.lesswrong.com/posts/KFBhguD7dSjtmRLeg/lobste.rs
Fixed, same HTTPS problem Raemon commented on above.
In multiple LessWrong surveys, biorisk was ranked as a more probable existential risk than AGI. At the same time, there’s very little written on LessWrong about biorisk. If we could recruit people into our community who could represent that topic well, I think it would be very valuable.
Minor conflict of interest disclaimer: I’ve recently become much more interested in computational biology and therefore have a personal interest in having more content related to biology in general on LW.
I’d be excited about having more representation from the experimental sciences, e.g. biology, certain areas of physics, chemistry, on LessWrong. I don’t have a good sense of how many total LW users come from these fields, but it certainly doesn’t seem like many prominent posters/commenters do. The closest thing to a prominent poster who talks about experimental science is Scott Alexander.
My sense from random conversations I’ve had over the years is that there’s a lot of tacit but important knowledge about how to do experimental research and lab work well that isn’t written down anywhere and could make for interesting complementary content to the wealth of content on LW about the connection between rationality and doing theory well. There’s also an untapped treasure trove of stories about important discoveries in these areas that could make for good LW post series. I’d love to see someone take me through the history of Barbara McClintock’s discoveries or the development of CRISPR from a rationalist perspective (i.e. what were the cognitive strategies that went along with discovering these things). There are books on discoveries like this of course, but there are also books on most of the material in the Sequences.
Having more LWers from experimental sciences could also provide a foundation for more detailed discussion of X-risks outside of transformative AI, bio-risks in particular.
In terms of attracting these sorts of people, one challenge is that younger researchers in these areas in particular tend to have long hours due to the demands of lab work and therefore may have less time to post on LW.
Biology-nerd LWer here (or ex-biology-nerd? I do programming as a job now, but still talk and think about bio as a fairly-high-investment hobby). BS in entomology. Disclaimer that I haven’t done grad school or much research; I have just thought about doing it and talked with people who have.
I suspect one thing that might appeal to these sorts of people, which we have a chance of being able to provide, is an interesting applied-researcher-targeted semi-plain-language (or highly-visual, or flow-chart/checklist, or otherwise accessibly presented) explanation of certain aspects of statistics that are particularly likely to be relevant to these fields.
ETA: A few things I can think of as places to find these people are “research” and “conferences.” There are google terms they’re going to use a lot (due to research), and also a lot of them are going to be interested in publishing and conferences as a way to familiarize themselves with new research in their fields and further their careers.
Leaning towards the research funnel… here are some things I understand now that I did not understand when I graduated, many of which I got from talking/reading in this community, and which I think a “counterfactual researcher me” would have benefited from a lucid explanation of:
Transferring intuitions around normalization
how to do it, why to do it
see how it eliminates spurious leads in data like goddamned magic
How to Handle Multiple-Hypothesis-Testing (see the sketch after this list)
A really good explanation of MCMC
Implied “priors,” model assumptions, and how to guess which ones to reach for and screen out the ones that are wrong
When is trying to add ML to your research a good or bad idea
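For the multiple-hypothesis-testing item above, here’s a minimal sketch of one standard correction (Benjamini-Hochberg). This is my illustration with made-up p-values, not something from the original discussion:

```python
# A hedged sketch of the Benjamini-Hochberg procedure, one standard
# multiple-hypothesis-testing correction. If you test m hypotheses at
# alpha = 0.05, you expect ~0.05 * m false positives from chance alone,
# so raw p < 0.05 isn't enough; BH controls the false discovery rate.
# The p-values below are made up for illustration.
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of hypotheses to reject at FDR <= alpha."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)                         # ranks, smallest p first
    thresholds = alpha * np.arange(1, m + 1) / m  # BH line: alpha * k / m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()            # largest rank under the line
        reject[order[: k + 1]] = True             # reject that rank and better
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
print(benjamini_hochberg(pvals))  # only the two smallest p-values survive
```

Note how the cluster of p-values just under 0.05 gets rejected once you account for having run eight tests; that’s exactly the kind of “spurious lead” the normalization and stats items are about.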
Things I think we’ve done that seem appealing from a researcher perspective include...
Some of the stats stuff (I may not remember the precise sources, but this is where I picked up an understanding of most of what I listed above)
Thoughtful summaries/critiques of certain papers
Scott Alexander’s stuff on how to thoughtfully combine multiple semi-similar studies (so very much of it, but maybe one in particular that stood out for me was his bit on the funnel graph)
I vaguely remember seeing a good explanation of “how and why to use effect sizes” somewhere
(...damn, is Scott really carrying the team here, or is this a perception filter and I just really like his blog?)
Small sample sizes, but I think people in the biology reference class bounce off of Eliezer’s writing style more than people in the programming reference class do (fairly typical “reads-as-arrogant” stuff; I didn’t personally bounce off it, so I’m transmitting this secondhand). I don’t think there’s anything to be done about this; just sharing the impression. Personally, I’ve felt moments of annoyance with random LWers who really don’t have an intuitive feel for the nuances of evolution, but Eliezer is actually one of the people who seems to have a really solid grasp on this particular topic.
(I’ve tended to like Eliezer’s stuff on statistics, and I respected him pretty early on because he’s one of the (minority of) people on here who have a really solid grasp of what evolution is/isn’t, and what it does/doesn’t do. Respect for his understanding of a field of study I did understand rubbed off as respect for him in fields of study he understood better than I did (ex: ML) by default, at least until my knowledge caught up enough that I could reason about it on my own.)
((FWIW; I suspect people in finance might feel similarly about “Inadequate Equilibria,” and I suspect they wouldn’t be as turned off by the writing style. They are likely to be desirable recruits for other reasons: finance at its best is fast-turnaround and ruthlessly empirical, it’s often programming or programming-adjacent, EA is essentially “charity for quantitatively-minded people who think about black swans,” plus there’s something of a cultural fit there.))
Networking and career-development-wise… quite frankly, I think we have some, but not a ton, to offer biologists directly. Maybe some EA grants for academics and future academics who are good at self-advocacy and open to moving. I’ve met maybe a dozen rationalists I could talk heavy bio with, over half of whom are primarily in some other field at this point. Whereas we have a ton to offer programmers, and at earlier stages of their careers.
(I say this partially from personal experience, although it’s slightly out of date: I started my stay in the Berkeley rationalist community ~4 years ago with a biology-type degree. I had a strong interest in biorisk, and virology in particular. I still switched into programming. There weren’t many resources pointed towards early-career people in bio at the time (this may have changed; a group of bio-minded people including myself got a grant to host a group giving presentations on this topic, and were recently able to get a grant to host a conference), and what existed was pointed at getting people to go to grad school. Given that I had a distaste for academia and no intention of going to grad school, I eventually realized the level of resources or support that I could access around this at the time was effectively zero, so I did the rational thing and switched to something that pays well and plugged into a massive network of community support. And yes, I’m a tad bitter about this. But that’s partially because I just had miscalibrated expectations, which I’m trying to help someone else avoid.)
Wow, thanks for your detailed reply! I’m going to just sort of reply to a random sampling of stuff you said (hope that’s OK).
Makes sense, I’ve been learning more statistics recently and would have appreciated something like this too.
Speculation, but do you think this might also be because people in more applied sciences tend to be more skeptical of long chains of reasoning in general? My sense is that doing biology (or chemistry) lab work gives you a mostly healthy but strong skepticism of theorizing without feedback loops, because theorizing about biology is so hard.
That’s fair. I do think it’s worth distinguishing between the rationalist community in a specific case and LW itself, even though they’re obviously strongly overlapping. I say this because I can imagine a world where LW attracts a mostly socially separate group of biology-interested folks who post and engage but don’t necessarily live in Berkeley.
+1 to targeting finance-types, though probably many/most are savvy enough that they won’t find EA compelling.
We used to have posts like https://www.lesswrong.com/posts/pWi5WmvDcN4Hn7Bo6/even-if-you-have-a-nail-not-all-hammers-are-the-same , so quite a few people would read it.
Finance. Trading specifically.
I’d be interested in you saying more words about this – both about why it seems like a particularly promising area, and also if you have recommendations of who to approach or how to go about it.
One question might be: are there particular trading bloggers who seem “LW-adjacent” whom we could get to crosspost here?
I’d be very interested to see someone talk about how many forces in finance are driven by superstition about superstition… for instance, how you can have situations where nobody really believes tulips are valuable, but disastrous things must now happen as a result of everyone believing that others believe that others believe that [...seemingly ad infinitum...] tulips are valuable. Where do these beliefs come from? How can they be averted? This kind of question seems very much in this school’s domain.
There would have to be some speculation about how a working logic of self-fulfilling prophecy like FDT would wrangle those superstitions and drive them towards a sane equilibrium of optimal stipulations. I’d expect FDT to have a lot to say.
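One toy way to see the dynamic (purely illustrative numbers; not a claim about real markets or about how FDT would actually formalize this): model each trader as paying whatever they expect a slightly-less-sophisticated trader to pay tomorrow.

```python
# A toy model (all numbers invented) of a price held up purely by
# beliefs about others' beliefs: a level-k trader pays what they expect
# a level-(k-1) trader to pay tomorrow, discounted for the cost of
# holding the tulip one more round.
FUNDAMENTAL = 1.0     # what everyone privately thinks a tulip is worth
NAIVE_BELIEF = 100.0  # what everyone assumes *others* will pay
DISCOUNT = 0.95       # cost of holding the tulip one more round

price = NAIVE_BELIEF
for depth in range(10):
    print(f"reasoning depth {depth}: willing to pay {price:.2f}")
    # Each extra level of "they believe that they believe..." shaves
    # off one round of discounting; the price only collapses back to
    # the fundamental value in the limit of very deep mutual reasoning.
    price = max(FUNDAMENTAL, DISCOUNT * price)
```

In this sketch the bubble unwinds only as the depth of everyone’s reasoning about everyone else’s reasoning grows, which is one crude way to frame why shared higher-order beliefs can sustain prices nobody privately endorses.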
I would be interested in seeing more applied fields, and also specializations which operate at the intersection of multiple fields. Some examples include:
Operators, in the sense of people with executive responsibility. I have enjoyed reading the after-action reports from the organizing and foundation-forming efforts that have come out of this website and EA.
Finance, which is essentially the field of applied distribution of risk. We have finance people on here, but there seems to be little content in terms of top-level posts from them (the easiest way to tell there are finance people present is to look at the top-level finance posts and then look at the criticism in the comments).
Industrial or Systems Engineering, which are fields dedicated to integrating other fields and applied optimization of the group all together.
The adjacent memeplex of Effective Altruism seems to have a bunch of operations and finance people in it.
We might consider trying to target people who are connected to teaching or instruction in their area of expertise somehow. I expect the average level of engagement with a new subject is quite a bit deeper here than in most other communities, so we might be in a position to offer an audience of motivated learners as an enticement to them. Simultaneously, the instruction experience will help with the problem of technical posts having too high a threshold to engage with them.
I’d make an argument for ‘soft-sciences’ and humanities. Philosophy, cultural anthropology, history, political science, sociology, literature, and maybe even gender studies. Computer science, mathematics, economics, and other STEM-heavy fields are already pretty well represented within the current LW community.
The focus on group rationality and developing a thriving community seems like it could benefit from the expertise these fields bring to the table. This might also reduce the amount of ‘reinventing the wheel’ that goes on (which isn’t necessarily a bad thing, but does consume scarce cognitive resources).
Further, I think there’s a case to be made that a lot of the goals of the rationalist movement could be furthered by strengthening connections to serious academic fields that are less likely to come into memetic contact with rationalist ideas. If nothing else, it would probably help raise the sanity waterline.
-
We’re at a point where gender studies shouldn’t even be considered part of the humanities anymore, I’d say. As you remind us, they’re severely in denial about what biology, medicine, and psychology have established with their experimental data. They’re the intellectual equivalent of anti-vax “activists” (except that the latter have yet to reach the same degree of entryism and grift).
There are other adjacent fields that are similarly problematic, being committed to discredited ideas like Marxist economics, or to what’s sometimes naïvely called “post-modernism” (actually a huge misreading of what the original postmodernists were in fact trying to achieve!). All of that stuff is way too toxic and radioactive to even think about seeking it out explicitly.
I understand and largely share your concerns. Theoretically, there’s a distinction between academic and activist gender studies; while the latter probably has almost nothing to offer and would likely just cause toxicity even if acting in good faith, the former might have more value. I am not confident about the degree to which this distinction between academic study and activist action exists in actual fact, though.
>This is a discipline that can only exist by the deliberate denial of the medical and psychological fields and all the reproducible experimental data of said fields.
I’m not sure that this is true. The fundamental idea that ‘gender’ is an interesting concept that interacts with culture and biology in sometimes surprising ways seems valid, so saying that there is _no way_ gender studies can exist as a field without ignoring medical facts strikes me as a much stronger statement than I’m comfortable with.
As much as there is a reading of gender studies that is opposed to science and rigour, I think there is a reading that paints modern gender studies and ‘LGBTism’ as a surprisingly rational movement, as it injects nuance (consider the distinction they make between biological sex and psychological gender) and has neat transhumanist themes (the LGBT biohacker movement is really interesting and is actually what made me consider any of this as being worth a second glance, check out Ada Powers on twitter for the sort of thing I’m talking about here).
Gender studies has been a source of a lot of toxicity, and its interactions with the rationalist community have not been a source of a lot of hope (I’m familiar with Scott Alexander’s and Scott Aaronson’s difficulties with feminist activists in the past), but I suspect/hope that there are _individual scholars_ in the field who try to avoid toxicity and apply whatever rigour they can to their work, and that by finding and reaching out to these people we might begin development of a more rationalist-friendly gender studies with higher norms for outgroup tolerance and evidence.
Again, I’m not super confident in this and I think there is a decent chance that this will wind up being pointless but it still seems worth spending a little time investigating.
Any reasonable scholar in gender studies faces a high reputational risk if they were to debate their field on LessWrong in a reasonable way. Any field that has dogmas that aren’t allowed to be publicly debated has a problem with the kind of open discussion we are having here.
The question is not just whether it’s pointless but about whether it’s potentially harmful.
This is a stronger case than the one Anderson made, I think, and it is one I take seriously (which is why I plan to approach this problem by reading material first to see what the landscape is actually like).
I agree with this statement, but the question is whether modern gender studies is actually such a field. Trying to make bold claims about the quality of academic discussion in a field neither I nor my conversation partner has actually investigated seriously strikes me as a futile exercise. I think it’s probably a bad idea to judge the quality of academic feminism by the merits of tumblr or ‘pop’ feminism in the same way it would be unfair to judge skeptic movements by the intellectual standards of r/atheism.
I’m also deeply skeptical of the idea that inviting feminists to participate in discussion would lead to an opening of the hellgates. LessWrong is a community that has examined infohazards and sees participants from a wide variety of political backgrounds including many that are considered extreme by most people, so my prior is that we’re better than most communities at managing political discord in a sane way.
We lost a room in which we held LW meetups in Berlin because LW discusses topics that shouldn’t be discussed. The discussion is in itself ‘unsafe’, regardless of how the topics are discussed or what conclusions are reached.
That’s norms for using a meeting room. When it comes to norms that the gender studies community expects their own members to follow, a person who has a reputational stake in the community has a lot more to lose from violating norms in that way.
This isn’t even a question of the academic quality of their discourse. r/atheism doesn’t attack people in a way that destroys careers and isn’t dangerous to anyone. This is different here. I wouldn’t want a lone reasonable person in the gender studies field to lose their social capital and/or career for associating with this place.
The standard way LW historically handled politics is by discouraging its discussion. SSC did things differently and paid a price for it.
That’s all separate from the actual quality of the academic discourse but it matters. As far as the discourse goes https://quillette.com/2019/09/17/i-basically-just-made-it-up-confessions-of-a-social-constructionist/ is an article by an insider where he reflects on the low standards he used over the decades.
Amusingly, the article you linked redirected to a different article which seems to reinforce your first point and I think helped clarify for me the exact dynamics of the situation. The author defends Dr. Littman’s paper on what she terms ‘rapid-onset gender dysphoria’ against the heavy backlash it received (mostly on twitter, it seems) and especially Harvard’s response to that backlash.
I find it difficult to imagine that healthy academic discourse could take place in an environment that conflict-heavy. Critically, this does not require the field itself to be nonsense but rather so deeply joined to the social justice culture war that the normal apparatuses of academia are hijacked.
This has raised my estimation of the risk of inviting gender studies researchers to participate in discussions on LW significantly, especially since as you point out, that risk runs in both directions.
There may still be ideas worth salvaging from the gender studies community and I’m really curious at what a ‘rationalist gender studies’ field looks like but the risk does look salient enough it may not be worth the effort.
You lost your meeting room because you were discussing (what I assume to be) politically sensitive topics. I think we’d agree that intellectual progress halts when important topics become too charged to touch and I don’t want feminism to become like that in the rationalist sphere.
But rationalist sphere != LessWrong and perhaps this isn’t the right place for progress in that area to happen. You bring up the differing approaches of SSC and LW and I actually quite like SSC’s approach of high-discussion-norms while not shying from sensitive topics, but you’re not wrong about paying a price for that.
So now I’m left wondering, if not here, then where? Where could rational-adjacent people sanely interact with feminists and sociologists and others in ‘challenging’ fields and what would the discussion there have to look like to keep people safe?
The answer might be ‘nowhere’. This could be a fundamentally irreconcilable difference, and if that’s the case then I will be sad about it and move on. I don’t think I have enough evidence to conclude this yet, but I will concede that if this place does exist, LessWrong probably isn’t it.
No, I lost it because it was a LessWrong meetup and there are such discussions on LessWrong (and our meetup.com page says SSC/LW, so SSC association was also a problem). The problem was not that the topics might be discussed at the meetup, which was more applied-rationality focused.
The problem was one of association, not one of meetup content. We could have held the meetup if we hadn’t linked it from any LW- or SSC-branded page and had called it a ‘rationality meetup’.
The ‘Darwinian Gender Studies’ facebook group seems one place worth mentioning. TheMotte was founded to have a place where discussion could happen with less collateral damage.
There might still be a risk for any insider to participate in them with their public identity attached. Private discussions behind closed doors would be less risky.
Perhaps /r/TheMotte?! (Backstory.)
Hmm, that might be worth exploring. Thanks
Name three?
-
I contend that those are not actually claims made by sociologists. Or if they are, they are minority opinions (in which case there would be other sociologists debunking them).
As a test, if you provide links to sociologists (or academic feminists/gender studies researchers) making each of those claims I will try to find others within the same field arguing against them.
-
To be perfectly honest, I’ve never stepped into a sociology department except to take classes that happened to be scheduled in the sociology building. The closest I’ve studied to sociology or gender studies in a formal setting was an introductory folklore course.
That being said, your statement sounds concerningly weakmanish, like the sort of criticism one would level at a field if one’s only experience with it were extremists and people complaining about the extremists. After some googling, I found an article in the Huffington Post by Dr. Carol Morgan, who has a PhD in Gender Communication, that references the nature vs. nurture debate (and provides anecdotal evidence on the nature side for the author’s sons’ gender identities).
This does not sound like a paragraph that would come out of a field that has decisively settled on ‘nurture’. I think the extent to which gender behaviors are biologically determined is still quite hotly debated within gender studies and the closest to a consensus view I can find is “they both play a role but people tend to naively overestimate the position of nature”.
Perhaps you should reevaluate what gender studies researchers actually believe?
-
Are you claiming that none of the differences between men and women are cultural? To me, that seems as obviously incorrect as saying all of them are. Not to go all ‘fallacy of the grey’ here, but this really does seem to be an issue where both sides are a major influence. IQ is around 50% heritable, but the other 50% also matters.
My view is that if we accept both biological and cultural influences on behavior then behavioral geneticists, neurologists, evolutionary psychologists, etc. focus their effort on the biological side and sociologists and academic feminists focus on the cultural side. Can you not see how, at least in theory, this is an interesting dynamic? Even if it were the case that all academic feminists think all observed social differences between men and women come from social causes (which, again, I think is a weakman argument) can’t you see that there’s something worth investigating there?
There is a fascinating feedback loop between biology and culture and the ways in which (mostly) static biological realities are interpreted culturally in many different ways and how this can shape the lives of people living within that culture are varied and difficult to describe simply. One of the things that I love so much about the rationalist community is their daring attempts to tackle really challenging issues in a clear manner. Things like the Human’s Guide to Words sequence or SSC’s Categories post take a look at the nuance and complexity of language and culture and make an honest, and in my view surprisingly successful, attempt to pull coherent, useful models out of the mud. I think we should do this for more stuff and I think gender is one of the issues that could really use a nice, demystifying treatment.
And when I ask myself where I might find people who could help with this ‘demystify gender’ project, I recall that there is an entire field of study that deals with this topic specifically. Even if there’s a bunch of crap coming out of that field, hell, even if 95% of it is people trying to find ways to confuse the issue harder or just trying to score political points, surely there’s clear-thinking people in there somewhere, right? There are people who went to school to study this stuff because they found it interesting in the same way some people find probability theory or linguistics interesting.
I’m not saying we should open the floodgates to every tumblr feminist with a grudge, but do you really think that trying to find open-minded gender studies researchers who would be willing to engage in adversarial collaboration would be such a terrible idea? Do you really take such an uncharitable view of the field you can’t imagine any usable work coming out of it?
This sounds like it’s written by a person who’s not quite clear what X percent heritable means. Apart from that, making up numbers like this for rhetorical purposes and treating them as if they are factual is bad form.
The right answer to the nature vs. nurture debate isn’t “it’s 50-50” but: that’s a bad question and a bad frame for understanding reality.
Instead of debating nature vs. nurture, one should look at the empirical findings we have and build up a view of the world based on them.
I agree, that was a confused point for me to make that didn’t advance my main argument. The initial claim Anderson made was that the field of gender studies advocated total social determination of all observed differences between genders, I argued that this was not the case and provided an instance of a gender communications researcher discussing the biological influences on gendered behavior.
The point about IQ was a half remembered factoid from a metastudy I read a while back and I’ve been unable to find subsequently so it’s likely misremembered. It’s irrelevant to the discussion though, I think.
Exactly 50-50 would be a very surprising result for a meta-study. “50% heritable” has an exactness that “around half heritable” doesn’t have.
Treating both of those the same way is what I would expect from people who don’t respect actual numbers.
It was, as I admitted, a mistake. I was being inexact as it was not critical for my central point, if it was I would have looked it up, failed to find it, and adjusted my approach (or more likely, left out IQ altogether). I’m unsure what continuing to belabor this accomplishes aside from chastising me for insufficiently respecting numbers.
You admitted a mistake but it wasn’t the mistake for which I was criticizing you. I don’t have a problem with people misremembering numbers. This prompted me to explain my criticism.
-
Alright, a different angle then. If we did find some academic feminists or gender studies researchers who were willing to engage in good faith, serious discussion without trying to be activist or throwing around accusations of -isms or -phobics, would you object to their presence in the community? The hostility you’ve shown towards an entire field is something I find deeply concerning.
Perhaps you and I just have fundamentally different approaches towards outgroups since I honestly cannot think of a single group I would treat the way you’ve been treating feminists in this discussion.
New age pagans, reactionaries, anarchists, neoliberals, small-c-conservatives, and even the alt-right; I consider these to be among my outgroups and I could make major criticisms of their core philosophies as well as how they generally conduct themselves in discourse. But if a member of any one of them actually wanted to engage me in a real discussion in good faith I would take them up on it (time permitting, of course) and if they brought up evidence I had overlooked or perspectives I hadn’t considered then I would gladly update my views in response.
This is pretty close to my entire ethos; it’s the reason I became a rationalist in the first place and the reason I think the rationalist community has a chance to help the world where so many ‘grand vision’ movements have failed. But we have to be willing, no, eager, to engage our ideological opponents and take from them what value we can.
When I see you repeating antifeminist talking points and taking a dramatically uncharitable view of a huge academic field and political movement (and yes, I am bothered by the extent to which those two overlap), a view which seems to be informed by their most vitriolic and toxic members (and yes, the more moderate members seem to do frustratingly little to rein in their extremist counterparts), what I keep thinking is: we’re supposed to be better than this.
I’ve decided to interpret this as genuine. Throughout this whole conversation I’ve been annoyed at you for not engaging with what gender studies scholars actually believe, but my exposure to their ideas has basically been Wikipedia, some mild googling, and popular media. We’ve been going back and forth about whether feminists can argue coherently and in good faith and whether the field of gender studies is suitably rigorous but I’m only just now realizing the best way to resolve the question is to read some of their stuff critically and form my own opinions.
I’ve got a hypothesis that feminist social theory could be a helpful addition to the ever-growing rationalist canon and a way to test that just by doing a little reading. I’ll let you know if it turns out you were right all along.
Give specific examples. What do gender theorists claim to be trying to do, and how are they failing to do it?
-
Aside from what’s already here, I can think of a few “character profiles” of fields that would benefit from LessWrong infrastructure:
Hard fields that are in decent epistemic health but could benefit from outsiders and cross pollination with our memeplex (e.g. economics).
Object level things where outside experts can perform the skill but the current epistemological foundations are so shaky that procedural instructions work poorly (e.g. home cooking).
Things that are very useful where good information exists but finding it requires navigating a lemon market (e.g. personal finance).
Fields that have come up regularly as inputs into grand innovations that required knowledge from multiple areas (e.g. anything Elon needed to start his companies)
I don’t think the bottleneck is lack of recruitment though, the problem is that content has no place to go. As you rightly point out, things that aren’t interesting to the general LW audience get crickets. I have unusual things I really want to show on LessWrong that are on their 5th rewrite because I have to cross so many inferential gaps and somehow make stuff LW doesn’t care about appealing enough to stay on the front page.