All right. I think I did overreact. And it’s not right to try to defensively turn it into a point about something bigger, even if that bigger thing is much more important.
However, since no-one has talked very much about why an intervention like this looks so bad, let me try to do so. But first, let’s bear in mind the sort of intervention which normally leads to criticism of editors and moderators: negative interventions, like deletions and blocks. This was actually a promotion, so it’s a different sort of faux pas (as wedrifid aptly termed it).
Basically, Eliezer took someone’s essay, gave it a new title that is a psychological self-assessment written in the first person (that is, written as if it were produced by the article’s author, rather than by the editor), and then promoted the essay on the front page. The new title went immediately into Google, into the RSS feed (from where it also ended up on Twitter), and who knows how many other places. Then he asked the author if she liked her new look.
The old title was somewhat forward-looking: “Why Less Wrong hasn’t changed my life (yet)”. The new title is not: it’s just “How I Ended Up Non-Ambitious”. The new title corresponds more directly to the content of the essay, but presumably the old title reflected authorial intent as well. Maybe there is a push and pull going on inside Swimmer963: a will to realism, LW social pressure to attempt great things, and authentic personal ambition versus a desire for a comfortable life and a desire to avoid fiascos, and who knows what else.
This situation only arises because of LW’s ambiguous status, halfway between “group blog” and “rationality broadsheet”. If LW were a magazine with columnists, we wouldn’t be so surprised at such editorial interventions. But if someone hacked into your personal blog and changed the titles of all your posts according to their private understanding of what the posts were really about, that would feel very invasive.
In practice, people do try to shape their posts so that they meet an imagined LW standard—I don’t just mean quality of reasoning or clarity of expression; I also mean a tone whereby the author says “There is this issue that you run across in life, how can we deal with it? Here’s how.” There’s a competition to exhibit methods of self-improvement that one has personally discovered and employed; who is the best at helping others to help themselves? Now that there’s a Discussion section, there’s less need to shape every post into that form; that’s now reserved for the featured articles. But I’ve certainly shaped one or two of my posts, somewhat artificially, to conform to an imagined LW style of communication.
So we’re all aware that there are standards and conventions which apply to ambitious :-) LW posts, and we can expect that they’ll be moved into Discussion if they’re judged not good enough, that people may ask us to rewrite them, and so on. But this is the first time I can remember a direct modification like this being made by a moderator. He gave an opportunity for dissent a short time later, but it still seems like bad practice, and this seems to have been recognized.
I occasionally ponder what LW’s objective place in the scheme of things might be. Will it ever matter as much as, say, the Vienna Circle? Or even just as much as the Futurians, who didn’t matter very much, but whose story should interest the NYC group? The Futurians were communists, but that was actually a common outlook for “rationalists” at the time, and the Futurians were definitely future-oriented.
Will LW just become a tiresome and insignificant rationalist cult? The more that people want to conduct missionary activity, “raising the sanity waterline” and so forth, the more this threatens to occur. Rationalist evangelism from LW might take two forms: boring and familiar, or eccentric and cultish. The boring and familiar form could encompass opposition to religion, psych 101 lectures about cognitive bias, and tips on how optimism and clear thinking can lead to success in mating and moneymaking. The eccentric and cultish form could be achieved by combining cryonics boosterism, Bayes-worship, insistence that the many-worlds interpretation is the only rational interpretation of quantum mechanics, and the supreme importance of finding the one true AI utility function.
It could be that the dominant intellectual and personality tendencies here—critical and analytical—will prevent serious evangelism of either type from ever getting underway. So let’s return for a moment to the example of the Vienna Circle, which was not much of a missionary outfit. It produced a philosophy, logical positivism, which was influential for a while, and it was a forum in which minds like Gödel and Wittgenstein (and others who are now much less well known, like Otto Neurath) got to trade views with other people who were smart and on their wavelength, though of course they had their differences.
Frankly, I think it is unlikely that LW will reach that level. The Vienna Circle was a talking shop, an intellectual salon, but it was perhaps one in ten thousand in terms of its lucidity and significance. Recorded and unrecorded history, and the Internet today, are full of occasions where people met, were intellectually sympatico, and managed to elaborate their worldview in a way they found satisfactory; and quite often, the participants in this process felt they were doing something more than just personally exciting—they thought they were finding the truth, getting it right where almost everyone else got it wrong.
I appreciate that quite a few LW contributors will be thinking, I’m not in this out of a belief that we’re making history; it’s paying dividends for me and my peers, and that’s good enough. But you can’t deny that there is a current here, a persistent thread of opinion, which believes that LW is extremely important or potentially so, that it is a unique source of insights, a workshop for genuine discovery, an oasis of truth in a blind or ignorant world, etc.
Some of that perception, I believe, is illusory, and comes from autodidacts thinking they are polymaths: people who have developed a simple working framework for many fields or many questions of interest, and who then mistake that for genuine knowledge or expertise. When this illusion becomes a collective one, that is when you get true intellectual cultism, e.g. the followers of Lyndon LaRouche. LaRouche has an opinion on everything, and so to those who believe him on everything, he is the greatest genius of the age.
Then, there are some intellectual tendencies here which, if not entirely unique to LW, seem to be expressed with greater strength, diversity, and elaboration than elsewhere. I’m especially thinking of all the strange new views, expressed almost daily, about identity, morality, reality, arising from extreme multiverse thinking, computational platonism, the expectation of uploads… That is an area where I think LW would unquestionably be of interest to a historian of technological subcultural belief. And I think it’s very possible that some form of these ideas will give rise to mass belief systems later in this century—people who don’t worry about death because they believe in quantum immortality, popular ethical movements based on some of the more extreme or bizarre conclusions being deduced from radical utilitarianism, Singularity debates becoming an element of political life. I’m not saying LW would be the source of all this, just that it might be a bellwether of an emerging zeitgeist in which the ambient technical and cultural environment naturally gives rise to such thinking.
But is there anything happening here which will contribute to intellectual progress? That’s my main question right now. I see two ways the answer might be yes: first, the ideas produced here might actually constitute intellectual progress; second, this might be a formative early experience for someone who goes on to make genuine contributions. I think it’s likely that the second will be true of someone—that at least one, and maybe several people, who are contributing to this site or just reading it, will, years from now, be making discoveries, in psychology or in some field that doesn’t yet exist, and it will be because this site warped their sensibility (or straightened it). But for now, my question is the first one: is there any intellectual progress directly occurring here, of a sort that would show up in a later history of ideas? Or is this all fundamentally, at best, just a learning experience for the participants, of purely private and local significance?