I found this a very useful post. It feels like a key piece in helping me think about CFAR, but it also sharpens my own sense of what stuff in “rationality” feels important to me. Namely: “Helping people not have worse lives after interacting with rationalist memes.”
I see. I guess that framing feels slightly off to me—maybe this is what you meant or maybe we have a disagreement—but I would say “Helping people not have worse lives after interacting with <a weird but true idea>”.
Like I think that similar disorienting things would happen if someone really tried to incorporate PG’s “Black Swan Farming” into their action space, and indeed many good startup founders have weird lives with weird tradeoffs relative to normal people, which often leads to burnout. “Interacting with x-risk” or “Interacting with the heavy-tailed nature of reality” or “Interacting with AGI” or whatever. Oftentimes stuff humans have only been interacting with in the last 300 years, or in some cases 50 years.
It might be useful to know that I’m not that sold on a lot of singularity stuff, and the parts of rationality that have affected me the most are some of the more general thinking principles. “Look at the truth even if it hurts” / “Understanding tiny amounts of evo and evo psych ideas” / “Here’s 18 different biases, now you can tear down most people’s arguments.”
It was those ideas (a mix of the naive and sophisticated forms of them) + my own idiosyncrasies that caused me a lot of trouble. So that’s why I say “rationalist memes”. I guess that if I bought more singularity stuff I might frame it as “weird but true ideas”.