someone in rationality [...] the community [...] many rationalists [...] the collective action problem of how to allocate our attention as a community. [...] within the rationality community [...] positive effects on the community
What community?
The problems with email that you mention are real and important. I’m glad that people are trying to solve them. If you think one particular solution (such as earn.com) is unusually good and you want it to win, then it might make sense for you to do some marketing work on their behalf, such as the post you just wrote.
What I don’t understand (or rather, what I understand all too well and now wish to warn against after realizing just how horribly it’s fucked with my ability to think in a way that I am only just now beginning to recover from) is this incestuous CliqueBot-like behavior that makes people think in terms of sending email to “someone in rationality”, rather than just sending email to someone.
In the late ’aughts, Eliezer Yudkowsky wrote a bunch of really insightful blog posts about how to think. I think they got collected into a book? I can’t recommend that book enough—it’s really great stuff. (“AI to Zombies” is a lame subtitle, though.) Probably there are some other good blog posts on the lesswrong.com website, too? (At least, I like mine.)
But this doesn’t mean you should think of the vague cluster of people who have been influenced by that book as a coherent group, “rationalists”, the allocation of whose attention is a collective action problem (any more so than for any number of other similar clusters of people, like “biologists”, or “entrepreneurs”, or “people with IQs above 120”). Particularly since mentally conflating rationality (the true structure of systematically correct reasoning) with the central social tendency of so-called “rationalists” (people who socially belong to a particular insular Bay Area-centered subculture) is likely to cause information cascades, as people who naïvely take the “rationalist” brand name literally tend to blindly trust the dominant “rationalist” opinion as the correct one, without actually checking whether “the community” is doing the kind of information processing that would result in systematically correct opinions.
And if you speak overmuch of the Way you will not attain it.
You seem to be bringing up a hobbyhorse (mentally conflating rationality with “rationalists”) under a post that is at most tangentially related, which I personally think is fine but should be noted as such. (In other words, I don’t think this comment is a valid criticism of the OP, if it was intended as such.)
But this doesn’t mean you should think of the vague cluster of people who have been influenced by that book as a coherent group, “rationalists”, the allocation of whose attention is a collective action problem (any more so than for any number of other similar clusters of people, like “biologists”, or “entrepreneurs”, or “people with IQs above 120”).
Given that biologists do in fact face a collective action problem of allocating attention (which they solve using conferences and journals), it seems perfectly fine to me to talk about such a problem for rationalists as well. What is LW2 if not a (partial) solution to such a problem? (Perhaps “entrepreneurs” and “people with IQs above 120” can’t be said to face such a problem, but they’re also much bigger and less cohesive groups than “rationalists” or “biologists”.)
Thanks, the hobbyhorse/derailing concern makes sense. (I noticed that too, but only after I posted the comment.) I think going forward I should endeavor to be much more reserved about impulsively commenting in this equivalence class of situations. A better plan: draft the impulsive comment, but don’t post it, instead saving it as raw material for the future top-level post I was planning to eventually write anyway.
Luckily the karma system was here to keep me accountable and prevent my bad blog comment from showing up too high on the page (3 karma in 21 votes, including a self-strong-upvote; a poor showing for me).

Actually, jeez, the great-grandparent doesn’t deserve that self-strong-upvote; let me revise that to a no-self-vote.
The supermajority of the people that I interact with, in person and online, are people who were influenced by that book and, like me, make substantial life decisions on the basis of the associated arguments. Many of them likewise interact largely with other people who were influenced by that book.
Even stronger than that, people of this category are densely socially connected. The fact that someone identifies as “a rationalist” is pretty strong evidence that I know them, or know of them. This is in contrast with “entrepreneurs”, for instance. Even the most well-connected entrepreneurs don’t know most of the people who identify as entrepreneurs. Ditto for “people with IQs over 120”, and for biologists.
Why wouldn’t I draw a boundary around that cluster of people, and attempt interventions on that cluster in particular?
It seems to me that the “rationality community” is both a natural category and a useful one.
But perhaps you’re claiming that I should use this category, but that I shouldn’t give it the label of “rationality”, because then I’m importing the connotation (to myself) that this group is unusually rational?
A community is not relevant to the statement of the problem, but a community is relevant to the collective action problem of adopting a solution (depending on the solution). I agree that the opening sentence about sending “an email to someone in rationality” is unhealthy and condemn it with you.
But, as others said, Jacob is right to talk of “a coordination campaign to move the community” and at some point he has to name the community. (There are additional issues of whether the community exists and whether its existence or name is bad. Those are hobbyhorses.)