Ra is an emotional drive to idealize vagueness and despise clarity. It is a psychological mindset rather than rational self-interest; from inside, this cognitive corruption feels inherently desirable rather than merely useful.
Institutions become corrupted this way as a result of people in positions of power exhibiting the same kind of bias. It is not a conspiracy, just a natural outcome of many people having the same preferences. It is not conformity, because those preferences already pointed in that direction. (The people would have the same preference even if it were a minority preference, although social approval probably makes them indulge in it more than they would have otherwise.)
This attitude is culturally coded as upper-class, probably because working-class people need to do specific tasks and receive direct feedback if they get an important detail wrong, while upper-class people can afford to be vague and delegate all details to their inferiors. (Also, people higher in the hierarchy are often shielded from the consequences of mistakes, which further reduces their incentive to understand the details. Thus the mistakes can freely grow until they start interfering with the primary purpose of the institution. Even then the behavior is difficult to stop, because it is so distributed that firing a few key people would achieve no substantial change. And the people in a position to do the firing usually share the same attitude, so they could not correctly diagnose it as the source of the problem. But Ra is not limited to the domain of business.)
From inside, Ra means perceiving a mysterious perfection, which is awesome by being awesome. It has the generic markers of success, but nothing knowable beyond that. (If you can say that some thing is awesome because it does some specific X, that makes the thing less Ra.)
For example, an archetypally Ra corporation would be perceived as having lots of money and influence, and hiring the smartest and most competent people in the world, but you wouldn't know what it actually does, other than that it is an important player in finance or technology or something similar. (Obviously, there must be someone in the corporation, perhaps the CEO, who has a better picture of what the corporation is actually doing. But that is only possible because the person is also Ra. It is not something an average mortal such as you could fully comprehend.)
The famous Ra advertising template is: “X1. More than X.” (It is important that you don’t know how specifically it is “more” than the competing X’s, which implies it contains more Ra.)
The Virtue of Narrowness was written as an antidote to our natural tendencies towards Ra.
When people become attached to something that in their eyes embodies Ra, they become very frustrated with those who challenge their attitude. ("What horrible mental flaw could make this evil person criticize the awesomeness itself?" To them, disrespecting Ra does not feel like kicking a puppy, but rather like an attempt to remove all the puppy-ness from the universe, forever.) The frustrating behaviors include not only actively opposing the thing, but also ignoring it (an attack on its omni-importance) or trying to analyze it (an attack on its mysteriousness).
People under a strong influence of Ra hate being specific, communicating clearly, being authentic, exposing their preferences, and generally exposing anything about themselves. (If specific things about you are known, you cannot become Ra. You are stupid for throwing away this opportunity, and you are hostile if you try to make me do the same.) From the opposite perspective, authenticity and specificity are antidotes to Ra.
It seems to me that Ra is a desire to "become stronger" without any respect for the "merely real", combined with lots of wishful thinking. It is a superstimulus that makes the actual good feel like a pathetic failure.
(I tried to summarize the key parts of the original article and add my own interpretation. It is not exactly a definition, though maybe the first paragraph could be considered one, but at least it is shorter.)