The rationalist movement, rationality community,[1] rationalsphere or rationalistsphere[2] refers to the set of Bayesian-influenced modes of thinking practised by self-described rationalists or ‘aspiring rationalists’, typically associated with the Less Wrong diaspora and its associated communities.
This page was last properly edited in December 2017, around the time LW2.0 was getting started. It may not accurately reflect the state of affairs today – either the culture or how the community sees itself. – Ruby
History
Following the wind-down of the first era of Less Wrong in 2014, triggered primarily by the departure of Eliezer Yudkowsky and other top writers, the formerly close-knit online and offline groups continued with a diminished central focus.
Arguably, Scott Alexander has emerged as the most influential remaining figurehead, and Slate Star Codex commands the largest single share of engagement within the restructured community.
In 2016–2017, discussions about reviving the site took place and Vaniver was made benevolent dictator for life. He in turn empowered Oliver Habryka to form a team and launch the LessWrong 2.0 project, which runs the current site.
Illustration
A popular illustration of the communities comprising and related to the movement was created in 2014 on Slate Star Codex[3][4] (above).
A 2017 conceptual Venn diagram of the ‘rationalsphere’ has also been created.[5]
Definitions
Scott Alexander suggested the following definition in 2016:[6]
The rationalist community is a group of people (of which I’m a part) who met reading the site Less Wrong and who tend to hang out together online, sometimes hang out together in real life, and tend to befriend each other, work with each other, date each other, and generally move in the same social circles. Some people[7] call it a cult, but that’s more a sign of some people having lost vocabulary for anything between “totally atomized individuals” and “outright cult” than any particular cultishness.
But people keep asking me what exactly the rationalist community is. Like, what is the thing they believe that makes them rationalists? It can’t just be about being rational, because loads of people are interested in that and most of them aren’t part of the community. And it can’t just be about transhumanism because there are a lot of transhumanists who aren’t rationalists, and lots of rationalists who aren’t transhumanists. And it can’t just be about Bayesianism, because pretty much everyone, rationalist or otherwise, agrees that it is a kind of statistics that is useful for some things but not others. So what, exactly, is it?
This question has always bothered me, but now after thinking about it a lot I finally have a clear answer: rationalism is the belief that Eliezer Yudkowsky is the rightful caliph.
No! Sorry! I think “the rationalist community” is a tribe much like the Sunni or Shia that started off with some pre-existing differences, found a rallying flag, and then developed a culture.
Other definitions include:[8]
...typical rationalist philosophical positions include reductionism, materialism, moral non-realism, utilitarianism, anti-deathism and transhumanism. Rationalists across all three groups tend to have high opinions of the Sequences and Slate Star Codex and cite both in arguments; rationalist discourse norms were shaped by How To Actually Change Your Mind and 37 Ways Words Can Be Wrong, among others.
And:[9]
...a community that call themselves Rationalists, that read ‘high-IQ sites’ such as Marginal Revolution, Less Wrong, and Slate Star Codex, and according to various surveys, identify as liberal, are atheist or agnostic, and, in general, hold a ‘realist’ philosophical worldview.
Additional attributes
A crowdsourced list of traits of the community:[10]
Seeing the Prisoner’s dilemma and other game theory applications everywhere
Being perpetually vigilant about personal biases
Epistemic rationality through constantly aligning one’s beliefs as closely as possible with the actual state of the world
Individualistic vs Organizational
Raemon argues the rationalist movement could be subdivided into individualistic and organizational views.[11] Project Hufflepuff attempts to strengthen both approaches.
The Rationalsphere
This contains key modes of thinking for the individual, including:
Truthseeking: biases, empiricism, etc.
Impact: making the world a better place (e.g. effective altruism, AI safety)
Human: becoming a better person
On the other hand:
The Rationality Community
This encompasses the major affiliated organizations (CFAR, Giving What We Can, etc.) as well as the many meetup groups, friendships and relationships formed through participation in the community.
Rationalist-adjacent
Adjacent ideas include:[12]
There are people who agree on few to no rationalist positions but still like going to our parties and reading our blog posts. I coined the term “rationalist-adjacent” for this group before I got the idea that the names of all subdivisions of the rationalist community should begin with the letter C... A lot of Less Wrong references a lot of nerd culture, such as catgirls, anime, fanfiction, Harry Potter, My Little Pony, etc.
Neoreaction movement[13] – a notoriously adjacent idea which, whilst explicitly refuted by figures such as Eliezer Yudkowsky[14][15] and Scott Alexander,[16] is often actively associated with the community by critics.[17][18][19]
The Facebook group formerly known as ‘LessWrong’, now ‘Brain Debugging Discussion’
Wider rationalist fiction
Skepticism of the term
There is some drive to avoid any ‘isms’ and instead to focus on ‘rationality’ rather than ‘rationalism’.[23]
Culture
The culture is often defined by its sayings, such as those featured at rationalists-out-of-context.tumblr.com.
Related organizations
See also
References
Economics, stale memes, and distraction from productive activity
There doesn’t appear to be a clearly preferred term—August 2017
http://slatestarcodex.com/2014/09/05/mapmaker-mapmaker-make-me-a-map/
Created with Photoshop and Fractal Mapper. Uses a lot of free clipart. Relevant tutorials at the Cartographers Guild forums.
http://lesswrong.com/r/discussion/lw/ov2/what_exactly_is_the_rationality_community/
http://slatestarcodex.com/2016/04/04/the-ideology-is-not-the-movement/
https://thingofthings.wordpress.com/2015/05/07/divisions-within-the-lw-sphere/
http://greyenlightenment.com/defining-and-understanding-rationalism/
https://www.reddit.com/r/slatestarcodex/comments/65cnar/definitions_of_the_rationalist_movement/
https://web.archive.org/web/20130424060436/http://habitableworlds.wordpress.com/2013/04/21/visualizing-neoreaction/
http://yudkowsky.tumblr.com/post/142497361345/this-isnt-going-to-work-but-for-the-record-and
http://lesswrong.com/lw/fh4/why_is_mencius_moldbug_so_popular_on_less_wrong/
http://slatestarcodex.com/2013/10/20/the-anti-reactionary-faq/
https://techcrunch.com/2013/11/22/geeks-for-monarchy/
https://social-epistemology.com/2016/09/23/the-violence-of-pure-reason-neoreaction-a-basilisk-adam-riggio/
http://slatestarcodex.com/2013/04/06/polyamory-is-boring/
http://www.jdpressman.com/public/lwsurvey2016/Survey_554193_LessWrong_Diaspora_2016_Survey(2).pdf
“Some people[7] call it a cult”
—
In distinguishing between a cult and something better:
“And if in your spare time you consort simply with the people you like, you will again find that you have come unawares to a real inside: that you are indeed snug and safe at the centre of something which, seen from without, would look exactly like an Inner Ring. But the difference is that its secrecy is accidental, and its exclusiveness a by-product, and no one was led thither by the lure of the esoteric: for it is only four or five people who like one another meeting to do things that they like. This is friendship. Aristotle placed it among the virtues. It causes perhaps half of all the happiness in the world, and no Inner Ringer can ever have it.”
—C. S. Lewis, The Inner Ring, 1944
From the old discussion page on LW1.0 wiki:
Talk:Rationalist movement
This is a joke taken out of its context in the article. I think the line should be replaced with (...) if you want to leave the idea of [something was here in the original article].
I’m not motivated enough to fight over it. My arguments are:
it lacks the context of a whole section of the article
it could be taken out of context by those who consider us outgroup and
even more so when it’s there on the wiki on the site where the movement began tribening
i’d prefer if scott hadn’t formulated it this way because i find the idea of caliph eliezer fucking terrifying, and since the movement has that funny habit of engaging with its critics as if they were part of it, a more accurate, albeit less funny, formulation would have been “rationalism is the movement that discusses whether the rightful caliph is Eliezer Yudkowsky.”
Lead
I don’t think we have a very good lead for this article: what is “a set of modes” and how does it relate to actual communities of people? Alti (talk) 10:15, 24 April 2017 (AEST)