[Question] What is Ra?

Sarah Constantin coined “Ra” in this blog post, and it has become a frequently referenced concept within the rationalist-sphere. However, I don’t feel she provided a clear definition. How would you define it?
I thought the article provided a pretty clear definition: i.e., a preference for Mysterious, Prestigious, collective Authority over known, functional, individual capability.
Thank you for posting this, btw, because I hadn’t actually heard of it before, and reading the article allowed me to finally make sense of a way that my mother treated me as a child, that I couldn’t get my head around before. (Because it just seemed like she was demeaning me and my abilities personally, rather than simply having a semi-religious belief that no mere individual imperfect human could ever do something meaningful through their own actions, rather than through the divine authority of proper institutions.)
Oddly enough, I was actually trying to change a belief I’d picked up from her (that I can’t do anything meaningful or important “for real”) when I had the impulse to go look at LW and spotted your question, then read the article. It was just what I needed to wrap my head around the belief and remove it so that I don’t get nervous when I get close to doing something meaningful “for real”.
Indeed, what I found was that while I hadn’t fully internalized her belief in Ra, I effectively picked up as a background assumption the idea that only certain blessed people are allowed to market themselves successfully or succeed in business in a big way, or write Proper Books… and that I’m not one of them.
So even though I am about as anti-Ra in philosophy as they come, I still had a Ra-like alief that made me feel inadequate compared to the Mysterious Authorities when I tried writing books or promoting my work too effectively. (As opposed to the very ineffective methods of doing both that I’ve used for the past 14 years.) I’m very much looking forward to seeing what I can do without Ra-induced feelings of inadequacy dogging my steps.
Great to hear that this article helped you.
Ra is an emotional drive to idealize vagueness and despise clarity. It is a psychological mindset rather than rational self-interest; from inside, this cognitive corruption feels inherently desirable rather than merely useful.
Institutions become corrupted this way, as a result of people in positions of power exhibiting the same kind of bias. It is not a conspiracy, just a natural outcome of many people having the same preferences. It is not conformity, because those preferences already pointed in the specific direction. (The people would have the same preference even if it were a minority preference, although social approval probably makes them indulge in it more than they would have otherwise.)
This attitude is culturally coded as upper-class, probably because working-class people need to do specific tasks and receive direct feedback if they get an important detail wrong, while upper-class people can afford to be vague and delegate all details to their inferiors. (Also, people higher in the hierarchy are often shielded from the consequences of mistakes, which further reduces their incentive to understand the details. Thus the mistakes can freely grow to the point where they start interfering with the primary purpose of the institution. Even then the behavior is difficult to stop, because it is so distributed that firing a few key people would achieve no substantial change. And the people in a position to do the firing usually share the same attitude, so they cannot correctly diagnose it as a source of the problem. But Ra is not limited to the domain of business.)
From inside, Ra means perceiving a mysterious perfection, which is awesome by being awesome. It has the generic markers of success, but nothing knowable beyond that. (If you can say that some thing is awesome because it does some specific X, that makes the thing less Ra.)
For example, an archetypally Ra corporation would be perceived as having lots of money and influence, and hiring the smartest and most competent people in the world, but you wouldn’t know what it actually does, other than that it is an important player in finance or technology or something similar. (Obviously, there must be someone in the corporation, perhaps the CEO, who has a better picture of what the corporation is actually doing. But that is only possible because that person is also Ra. It is not possible for an average mortal such as you to fully comprehend.)
The famous Ra advertising template is: “X1. More than X.” (It is important that you don’t know how specifically it is “more” than the competing X’s, which implies it contains more Ra.)
The Virtue of Narrowness was written as an antidote against our natural tendencies towards Ra.
When people become attached to something that in their eyes embodies Ra, they are very frustrated about those who challenge their attitude. (“What horrible mental flaw could make this evil person criticize the awesomeness itself?” To them, disrespecting Ra does not feel like kicking a puppy, but rather like an attempt to remove all the puppy-ness from the universe, forever.) The frustrating behaviors include not only actively opposing the thing, but also ignoring it (an attack on its omni-importance), or trying to analyze it (an attack on its mysteriousness).
People under strong influence of Ra hate: being specific, communicating clearly, being authentic, exposing your preferences, and generally exposing anything about yourself. (If specific things about you are known, you cannot become Ra. You are stupid for throwing away this opportunity, and you are hostile if you try to make me do the same.) From the opposite perspective, authenticity and specificity are antidotes to Ra.
Seems to me that Ra is a desire to “become stronger” without any respect for the “merely real” and lots of wishful thinking. A superstimulus that makes the actual good feel like a pathetic failure.
(Tried to summarize the key parts of the original article, and add my own interpretation. It is not exactly a definition—maybe the first paragraph could be considered one—but at least it’s shorter.)
From a January 2017 Facebook conversation:
I formed my own opinion at the start, but I didn’t post it right away since I didn’t want to bias other people into agreeing with me. I guess the way I’ll answer this will be slightly different from the other answers, since I think the dynamics of the situation are more complex than an idealisation of vagueness. pjeby’s estimate seems closer when they say it’s a preference for mysterious, prestigious authority, but again I think we have to dive deeper.
I see Ra as a dynamic which tends to occur once an organisation has obtained a certain amount of status. At that point there is an incentive and a temptation to use that status to defend itself against criticism. One way of doing that is providing vague but extremely positive-sounding non-justifications for the things that it does, and using the status to prevent people from digging too deep. This works since there are often social reasons not to ask too many questions. If someone gives a talk, to keep asking follow-ups is to crowd out other people. People will often assume that someone who keeps hammering a point is an ideologue, or they simply lose interest. In any case, the questions can usually be answered with additional layers of vagueness.
This also reminds me of the concept of hyperreal or realer than real. Organisations that utilise Ra become a simulation of a great organisation instead of the great organisation that they might have once been. By projecting this image of perfection they feel realer than any real great organisation which will inevitably have its faults and hence inspire doubt.
ISTM that’s a result of worshipping Ra, rather than Ra-worship itself. Perhaps I am biased by my mother’s example, but she was not a part of any mysterious organizations or their status incentives. She merely believed that Church, State, Schools, Companies, or other such Capitalized Entities had mystical powers to which mere human individuals could not aspire, unless they were assimilated into those institutions and thereby earned the blessing of said mystical powers.
AFAICT, this did not come from the type of organizational evolution and incentives that you’re talking about; rather, this was simply a widely-held belief of hers that was largely independent of what competencies or institutions were being discussed. In her mind, ordinary humans couldn’t do jack squat; anything an ordinary human did without an appropriate institutional blessing was merely an exception that didn’t count the same as doing the thing “for real”. It was, in her mind, the same as an actor pretending to be a priest not actually being able to forgive your sins or perform a marriage ceremony… just extended to everything that institutions or some sort of orthodoxy existed for.
So ISTM that the primary dynamic is that deification of the abstract offers a superstimulus that can’t be matched by real, concrete, imperfect individuals, leading to worship of the abstraction in place of critical thinking or analysis. In effect, my mother was just doing the organizational/societal equivalent of people preferring their anime waifus or surgically-altered pornstars over real-life people. (IOW, removing details that imply imperfection or excess complexity is already a standard route to superstimulus in humans.)
Maybe I should have said that there are two sides to Ra: the institutional incentive, and the reason why people fall for it or (stronger) want it.
Establishing an institution is a costly signal that there is a group of people committed to spend years of their life working on some issue.
For example, Machine Intelligence Research Institute gives me the hope that if tomorrow Eliezer gets hit by a car, converts to Mormonism, or decides to spend the rest of his life writing fan fiction, the research will go on regardless. Which is a valuable thing.
But if you go along this direction too far, you get superstimuli. If MIRI is better than Eliezer’s blog, then a Global Institute For Everything Important must be a million times better, and MIRI should be ashamed of competing with it for scarce resources.
Another problem is that creating an institution is a signal of commitment to the agenda, but prolonged existence of the institution is often just a signal of commitment to salaries.
Maybe you should just play along and rename Mind Hackers’ Guild to, dunno, Institute for Mental Modification. Or something less Orwellian. :D
If you want to see Ra in its purest form, look to advertising. It’s positive affect free of information. Olive Garden is not your family; not all who eat Doritos are bold. Ra is a tale told by an idiot, full of sound and fury, signifying nothing. It is also often encountered in celebrities and politics (what is Kim Kardashian famous for, exactly?).
The opposite of Ra is the question “What have you done for me lately?”.
Over time, the concept of Ra settled in my head as… the spirit of collective narcissism, where narcissism is understood as a delusional striving toward the impossible social security of being completely beyond criticism: to be flawless, perfect, unimprovable; to pursue Good Optics with such abandon as to mostly lose sight of whatever it was you were running from.
It leads to not being able to admit most of the org’s imperfections, even internally. And even when they do admit an imperfection internally, doing so resigns them to it, and they submit to it.
I don’t like to define it as the celebration of vagueness; in my definition that’s just an entailment, something narcissism tends to do in order to hide.