I formed my own opinion at the start, but I didn’t post it right away since I didn’t want to bias other people into agreeing with me. I guess the way I’ll answer this will be slightly different from the other answers, since I think the dynamics of the situation are more complex than an idealisation of vagueness. Pjeby seems closer to the mark in saying it’s a preference for mysterious, prestigious authority, but again I think we have to dive deeper.
I see Ra as a dynamic which tends to occur once an organisation has obtained a certain amount of status. At that point there is an incentive and a temptation to use that status to defend itself against criticism. One way of doing that is providing vague but extremely positive-sounding non-justifications for the things it does, and using the status to prevent people from digging too deep. This works since there are often social reasons not to ask too many questions. If someone gives a talk, asking follow-up after follow-up crowds out other people. People will often assume that someone who keeps hammering a point is an ideologue, or will simply lose interest. In any case, such questions can usually be answered with additional layers of vagueness.
This also reminds me of the concept of the hyperreal, or “realer than real”. Organisations that utilise Ra become a simulation of a great organisation instead of the great organisation they might once have been. By projecting this image of perfection, they feel realer than any actual great organisation, which will inevitably have its faults and hence inspire doubt.
ISTM that’s a result of worshipping Ra, rather than Ra-worship itself. Perhaps I am biased by my mother’s example, but she was not a part of any mysterious organizations or their status incentives. She merely believed that Church, State, Schools, Companies, or other such Capitalized Entities had mystical powers to which mere human individuals could not aspire, unless they were assimilated into those institutions and thereby earned the blessing of said mystical powers.
AFAICT, this did not come from the type of organizational evolution and incentives that you’re talking about; rather, this was simply a widely-held belief of hers that was largely independent of what competencies or institutions were being discussed. In her mind, ordinary humans couldn’t do jack squat; anything an ordinary human did without an appropriate institutional blessing was merely an exception that didn’t count the same as doing the thing “for real”—it was in her mind the same as an actor pretending to be a priest not being able to actually forgive your sins or perform a marriage ceremony… just extended to everything that institutions or some sort of orthodoxy existed for.
So ISTM that the primary dynamic is that deification of the abstract offers a superstimulus that can’t be matched by real, concrete, imperfect individuals, leading to worship of the abstraction in place of critical thinking or analysis. In effect, my mother was just doing the organizational/societal equivalent of people preferring their anime waifus or surgically-altered pornstars over real-life people. (IOW, removing details that imply imperfection or excess complexity is already a standard route to superstimulus in humans.)
Establishing an institution is a costly signal that there is a group of people committed to spend years of their life working on some issue.
For example, Machine Intelligence Research Institute gives me the hope that if tomorrow Eliezer gets hit by a car, converts to Mormonism, or decides to spend the rest of his life writing fan fiction, the research will go on regardless. Which is a valuable thing.
But if you go along this direction too far, you get superstimuli. If MIRI is better than Eliezer’s blog, then a Global Institute For Everything Important must be a million times better, and MIRI should be ashamed for competing with it for scarce resources.
Another problem is that creating an institution is a signal of commitment to the agenda, but prolonged existence of the institution is often just a signal of commitment to salaries.
Maybe you should just play along and rename Mind Hackers’ Guild to, dunno, Institute for Mental Modification. Or something less Orwellian. :D
If you want to see Ra in its purest form, look to advertising. It’s positive affect free of information. Olive Garden is not your family; not all who eat Doritos are bold. Ra is a tale told by an idiot, full of sound and fury, signifying nothing. It is also often encountered in celebrities and politics (what is Kim Kardashian famous for, exactly?).
The opposite of Ra is the question “What have you done for me lately?”.
Maybe I should have said that there are two sides to Ra—the institutional incentive, and the reason why people fall for this or (stronger) want this.