This machine resists Moloch
Jarred Filmer
ha don’t worry it basically is 😄, it’s just that (for me at least) the notion I could put effort into making strong 1-1 connections with people and forming intimate small groups online wasn’t really something that occurred to me to do before I started reading about microsolidarity.
May also be worth noting that the microsolidarity framework is about a bunch of other stuff beyond just crews and case clinics, notably dynamics that come into play once you try to take a bunch of crews and form a larger group of ~150 or so people out of them.
Experimenting with microsolidarity crews
I agree with the content of your comment but the framing gives me a sense of bad faith, and makes me uncomfortable.
If I put a lot of time into a post-mortem detailing how an 8 year project I put a lot into went wrong, and then the top comment was someone summing up what I’d done in an uncharitable paragraph saying things like “making stuff up” and “no shit sherlock” I’d feel like I’d tried to do a good thing for the discourse at large and was defected against.
To echo others, thank you for putting your time and effort into this, I found it coherent and valuable. As an international rat/EA whose only context for Leverage was Zoe's post, this fleshed out my understanding of what you were trying to do in a helpful way and gave me a lot to chew on regarding my own thoughts on ideological communities.
Regarding: “Why do people seem to hate us?”
After reading Zoe’s post I had a very negative view of Leverage and Geoff, after some introspection here is my best guess as to why.
Growing up religious, I’m very aware that my individuality can be radically subordinated by:
Potent ideas
Potent community
And when those two are mixed in the form of ideological communities it's powerful and it's intoxicating. And like most intoxicating and powerful things, there is a great potential for harm. Thus whenever I find myself moving in a group of people high on ideas and connection, there's a voice in the back of my mind constantly asking:
Is this a cult yet?
Is this a cult yet?
Is this a cult yet?

This is a voice I know for a fact is speaking in the back of the minds of several EAs and rationalists I know personally. And I'd be shocked if anyone who's had a brush with the failure modes of ideological community and finds themselves in a reading group for The Precipice isn't thinking that to themselves.
So when I read of an insular group in the memeplex I call home getting high on ideas and each other, and then talking about "magic" and "demons", my strong reaction was fear. It's happening, kill it with fire before it spreads and ruins everything.
I'm currently agnostic on whether leveraging the power of potent community and ideas is something that can be channeled safely, but I don't blame you guys for trying; and I recognise that my initial reactions to the topic of Leverage and Geoff Anders were mixed up with a non-trivial amount of counterproductive fear.
A Tree of Light
It gave me an emotional intuition for what more progress along the “distance from violence” scale might look like. If we don’t even have to pull the trigger anymore and can be assured no unintended casualties, maybe it’s more pressure towards the equilibrium of the state relying on violence to govern, and then to suppress the dissent that violence generates with more violence.
I see two independent ideas in this post
Insidious Inception
People communicate thoughts into each other's minds
This can be direct *”I do not want to date you”*
Or indirect *”Sorry I’m too busy this week” with no effort to find a different time*
Saying A to indirectly communicate B can:
Obscure an intention that would be obvious were B said directly
Make it harder to refute B, because the idea that A → B needs to first be established
Delicately communicate B, without the additional implications that stating B directly would carry
Core thoughts
You have ideas that are small and do not affect your base perception of reality; we call this trivia/facts/knowledge.
You have other ideas that are big, and which construct your reality in a way that's hard to appreciate without meditation/psychedelics/hippie workshops. We call this worldview/identity/schemas.
Mixed to form a third idea:
The norms of healthy communication can be especially abused by someone doing this "insidious inception" to add or alter someone's "core thoughts". If someone is doing this to you (deliberately or otherwise), the norms of healthy communication you'd normally use to get people to stop doing things you don't like may not work, and may instead make you more vulnerable.
Wait hold on, I thought this was a feature of QV that made it well suited to funding public goods 😄? (The more individuals each find the same thing beneficial, the more it must be a “public good” and thus underfunded)
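The mechanism being alluded to is the standard quadratic funding match (as in Buterin, Hitzig & Weyl's "Liberal Radicalism"): total funding scales with the square of the sum of square roots of contributions, so broad support from many small contributors draws a larger match than the same amount from one donor. A minimal sketch, assuming that simple rule (the function name is mine, not from any library):

```python
import math

def qf_match(contributions):
    """Quadratic funding: the project's total funding is the square of
    the sum of square roots of individual contributions; the match is
    that total minus what was contributed directly."""
    total = sum(math.sqrt(c) for c in contributions) ** 2
    return total - sum(contributions)

# Four people giving $1 each attract a match, while one person giving
# $4 attracts none: breadth of support is what signals a public good.
print(qf_match([1, 1, 1, 1]))  # → 12.0
print(qf_match([4]))           # → 0.0
```

In practice matching pools are capped and the match is scaled down proportionally, but the underdetermined toy version above is enough to show why "many individuals each find it beneficial" is exactly the signal QF is designed to amplify.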
Thanks for your reply :) as in many things, QRI lays out my position on this better than I’m able to 😅
https://www.qualiaresearchinstitute.org/blog/a-primer-on-the-symmetry-theory-of-valence
Love it! I've been thinking a lot recently about the role of hedonics in generally intelligent systems. afaik we don't currently try to induce reward or punishment in any artificially intelligent system we build, we simply re-jig it until it produces the output we want. It might be that "re-jigging" does induce a hedonic state, but I see no reason to assume it.
I can't imagine how a meta optimiser might "create from scratch" a state which is intrinsically rewarding or aversive. In our own case I feel evolution must have recruited some property of the universe that was already lying around, something that is just axiomatically motivating to anything conscious, in the same way that the speed of light is just what it is.
At what age did you start trusting them to do things like only crossing at approved intersections?
Out of curiosity, does all of the difference between the value of a child drowning in front of you and a child drowning far away come from uncertainty?
I enjoyed this take https://www.roote.co/wisdom-age
Agree or disagree: “There may be a pattern wherein rationalist types form an insular group to create and apply novel theories of cognition to themselves, and it gets really weird and intense leading to a rash of psychological breaks.”
- Oct 19, 2021, 8:41 PM; 361 points; comment on "My experience at and around MIRI and CFAR (inspired by Zoe Curzi's writeup of experiences at Leverage)"
I empathise with the feeling of slipperiness in the OP, I feel comfortable attributing that to the subject matter rather than malice.
If I had an experience that matched Zoe's to the degree jessicata's did (superficially or otherwise) I'd feel compelled to post it. I found it helpful for the question of whether "insular rationalist group gets weird and experiences a rash of psychotic breaks" is a community problem, or just a problem with one stray dude.
Thanks for sharing, I’m about to move into a season of more time for hobby code and this seems like good advice to keep in mind
I’ve never seen that feeling described quite that way, I like it!
Out of curiosity, how do you feel about the proclaimed self evidence of "the cogito", "I think therefore I am"?
You’re quite welcome 🙂
For existence it’s “I think therefore I am”, just seems like an unavoidable axiom of experience. It feels like wherever I look I’m staring at it.
For consciousness I listened to an 80,000 Hours podcast with David Chalmers on The Hard Problem and ever since then it's been self evident there's something that it's like to be me. It felt like something that had to be factored out of my experience and pointed at for me to see. But it seems as self evident as existing.
For wellbeing and suffering it took some extreme moments for me to start thinking about the fact that some things feel good and bad and that might be like, quite important actually. Also with the realisation that I never decided to find wellbeing good and suffering bad they just are.
For causality I admit it's not as clear cut, and I only really thought about it yesterday reading this article. But in this moment I'm running an operating system shaped by the past. In that past I experienced the phenomena of prediction and causality. This moment seems no different to that moment, so it feels natural to unambiguously act as though this moment will affect the next.
Hmm, that last explanation feels much more unwieldy than existence, consciousness, and valence. Perhaps it doesn't quite deserve the category of self evident, and is more like n+1 induction.
I greatly enjoyed this, thanks for writing it. I matched it to one of the questions in my own personal pantheon of mysteries.
What does it mean for a belief to be self-evident?
It seems self evidently true that I exist, that I am conscious, that suffering is bad, that wellbeing is good, and that the next moment of experience will be the necessary consequence of this moment.
I can point to the raw justification for these facts in my experience, and I just assume that other people have similar justifications embedded within their subjective perspective. But it's still an intellectual mystery to me why "it's self evident" feels like a satisfying justification. As you say, maybe that too is self evident, ad infinitum.
Thank you, I thought so too 😊
And yeah, case clinics have given me a lot of value. If something like it is emerging naturally among your friends, then they sound like great friends!
If you do try to expressly instantiate a case clinic with the steps I’d be curious to hear how it goes. I’ve been surprised at the effect setting an explicit format can have on how it feels to be in a group. Something about creating common knowledge on where we’re all supposed to be directing our attention (and with what intention), can be really powerful. Thinking about it now I suppose this is how DnD works 😄