Rob B: I gather Ra is to a first approximation just ‘the sense that things are impersonally respectable / objective / authoritative / credible / prestigious / etc. based only on superficial indirect indicators of excellence.’
Ruby B: I too feel like I do not understand Ra. [...] Moloch, in my mind, was very clearly defined. For any given thing, I could tell you confidently whether it was Moloch or not. I can’t do that with Ra. Also, Moloch is a single clear concept while Ra seems to be a vague cluster if it’s anything. [...]
Rob B: Is there anything confusing or off about the idea that Ra is ‘respectability and prestige maintained via surface-level correlates of useful/valuable things that are not themselves useful/valuable (in the context at hand)’? Either for making sense of Sarah’s post or for applying the concept to real-world phenomena?
Ruby B: Yes, there is something off about that summary, since the original post seems to contain a lot more than “seeking prestige via optimizing for correlates of value rather than actual value”. [...] If your summary is at the heart of it, there are some links missing to the “hates introspection” and “defends itself with vagueness, confusion, incoherence” parts. [...]
Rob B: There are two ideas here:
(1) “a drive to seek prestige by optimizing for correlates of value that aren’t themselves valuable”
(2) a tendency toward fuzzy, inconsistent thinking (the “hates introspection” and “defends itself with vagueness, confusion, incoherence” part)
The connection between these two ideas is this paragraph in Sarah’s essay:
“‘Respectability’ turns out to be incoherent quite often — i.e. if you have any consistent model of the world you often have to take extreme or novel positions as a logical conclusion from your assumptions. To Ra, disrespectability is damnation, and thus consistent thought is suspect.”
(1) is the core idea that Sarah wants to point to when she says “Ra”. (2) is a particular phenomenon that Sarah claims Ra tends to cause (though obviously lots of other things can cause fuzzy/inconsistent thinking too, and a drive toward such). Specifically, Sarah is defining Ra as (1), and then making the empirical claims that this drive is commonplace; that pursuing any practical or intellectual project sufficiently consistently will at least occasionally force you to sacrifice either epistemics or prestige; and that the drive is powerful enough that a lot of people do end up sacrificing epistemics when that conflict arises.
Ruby B: Okay, yeah, I can start to see that. Thanks for making it clearer to me, Rob!
Rob B: I think Sarah’s essay is useful and coherent, but weirdly structured: she writes a bunch of poetry and mentions a bunch of accidental (and metaphorical, synesthetic, etc.) properties of Ra before she starts to delve into Ra’s essential properties. I think part of why I didn’t find it confusing was that I skimmed the early sections and got to the later parts of the essay that were more speaking-to-the-heart-of-the-issue, then read it back in reverse order. :P So I got to relatively clear things like the Horus (/ manifest usefulness / value / prestige-for-good-reasons) vs. Ra (empty respectability / shallow indicators of value / prestige-based-on-superficial-correlates-of-excellence) contrast first:
“Horus likes organization, clarity, intelligence, money, excellence, and power — and these things are genuinely valuable. If you want to accomplish big goals, it is perfectly rational to seek them, because they’re force multipliers. Pursuit of force multipliers — that is, pursuit of power — is not inherently Ra. There is nothing Ra-like, for instance, about noticing that software is a fully general force multiplier and trying to invest in or make better software. Ra comes in when you start admiring force multipliers for no specific goal, just because they’re shiny.”
And:
“When someone is willing to work for prestige, but not merely for money or intrinsic interest, they’re being influenced by Ra. The love of prestige is not only about seeking ‘status’ (as it cashes out to things like high quality of life, admiration, sex), but about trying to be an insider within a prestigious institution.”
(One of the key claims Sarah makes about respectability and prestige maintained via surface-level correlates of useful/valuable things that are not themselves useful/valuable (/ Ra) is that this kind of respectability accrues much more readily to institutions, organizations, and abstractions than to individuals. Thus a lot of the post is about how idealized abstractions and austere institutions trigger this lost-purposes-of-prestige mindset more readily, which I gather is because it’s harder to idealize something concrete, tangible, and weak, like an individual person. Or maybe it has to do with the fact that it’s harder to concretely visualize the proper function and work of something abstract and large-scale, so it’s easier to lose sight of the rationale behind what you’re seeing?)
“Seen through Ra-goggles, giving money to some particular man to spend on the causes he thinks best is weird and disturbing; putting money into a foundation, to exist in perpetuity, is respectable and appropriate. The impression that it is run collectively, by ‘the institution’ rather than any individual persons, makes it seem more Ra-like, and therefore more appealing.”
All of that makes sense. The earlier material from the first two sections of the post doesn’t illuminate much, I think, unless you already have a more specific sense of what Sarah means by “Ra” from the later sections.
Ruby B: Your restructuring and rephrasing are vastly more comprehensible. That said, poetry and poetic imagery are nice, and I don’t begrudge Sarah her attempt.
And given your explanation, perhaps your summary description could be made slightly more comprehensive (though less comprehensible) like so:
“Ra is a drive to seek prestige by optimizing for correlates of value that aren’t themselves valuable because you have forgotten the point of the correlates was to attain actual value.” [...]
Rob B: Maybe “Ra is a drive to seek prestige by optimizing for correlates of value, in contexts where the correlates are not themselves valuable but this fact is made non-obvious by the correlates’ abstract/impersonal/far-mode-evoking nature”?
From a January 2017 Facebook conversation: