Miscellaneous thoughts:
The way you use the word Moloch makes me feel like it is an attempt to invoke a vague miasma of dread. If your intention is to point coherently at a cluster of concepts or behaviors, I’d recommend using less flavorful terms, such as “inadequate stable equilibria”, “zero-sum behavior that spreads like cancer”, or “parasitism and predation”. Of course, these three terms are also vague, and I would recommend using concrete examples to communicate exactly what you are pointing at, but they are still less vague than Moloch. At a higher level, I recommend looking at some of Duncan Sabien’s posts as models for how to communicate abstract sociological concepts.
I’ve been investigating “Tuning Your Cognitive Strategies” off and on since November 2023, and I agree that it is interesting enough to be worth a greater investment in research effort (including my own), but I believe there are other rationality skills that may be significantly more useful for people trying to save the world. Kaj Sotala’s Multiagent sequence is, in my opinion, the rationality research direction with the highest potential impact in enabling people in our community to do the things they want to do.
The “Why our kind cannot cooperate” sequence, as far as I remember, focuses on failures of cooperation in our community that stem from irrationality: mistaking contrarianism for intelligence and high status, et cetera. I disagree with your attempt to use it as a reason to claim that the “bad guys” are predisposed to “victory”.
If I were focused on furthering coordination, I’d take a step back and actually try to further coordination and see what issues I face. I’d try to build a small research team focused on a research project, notice what irrational behavior and incentives show up, and try to figure out systemic fixes. I’d also try to create simple game-theoretic models of interactions between people working to make something happen and see what issues may arise.
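As one sketch of the kind of simple game-theoretic model I have in mind (the payoff numbers here are invented purely for illustration), even a two-player stag hunt exhibits an inadequate stable equilibrium: mutual cooperation and mutual defection are both Nash equilibria, so a team can get stuck in the worse one even though everyone prefers the better one.

```python
from itertools import product

# Hypothetical stag-hunt payoffs: payoffs[(row, col)] -> (row reward, col reward).
# Strategy 0 = commit to the joint project ("stag"),
# strategy 1 = retreat to a safe solo option ("hare").
payoffs = {
    (0, 0): (4, 4),  # both commit: best joint outcome
    (0, 1): (0, 3),  # lone cooperator is stranded
    (1, 0): (3, 0),
    (1, 1): (3, 3),  # both play it safe: stable but worse than cooperating
}

def pure_nash_equilibria(payoffs, strategies=(0, 1)):
    """Return strategy profiles where neither player gains by deviating alone."""
    equilibria = []
    for a, b in product(strategies, repeat=2):
        row_ok = all(payoffs[(a, b)][0] >= payoffs[(alt, b)][0] for alt in strategies)
        col_ok = all(payoffs[(a, b)][1] >= payoffs[(a, alt)][1] for alt in strategies)
        if row_ok and col_ok:
            equilibria.append((a, b))
    return equilibria

print(pure_nash_equilibria(payoffs))  # [(0, 0), (1, 1)]: two stable equilibria
```

The interesting failure mode is visible directly: (1, 1) is stable because no individual can improve their payoff by unilaterally cooperating, which is one concrete way to cash out “inadequate stable equilibria” for a small team.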
I think CFAR was recently funding projects focused on furthering group rationality. You should contact CFAR and talk to some of the people thinking about this.
Strong upvoted. I read this lightly as I am currently pulling an all-nighter, and will read it more deeply and give a proper response in 24-36 hours.