Because people aren’t rational. Motivated reasoning is a big factor but also we’re all trying to think using monkey brains.
Believing what feels good is evolutionarily adaptive, in the sense that arriving at correct conclusions about whether God or Singularities exist won’t help you much if those conclusions make your tribesmates dislike you. This bias is a cumulative, recursive problem that stacks up over the thousands or millions of cognitive acts that go into our beliefs about what we should care about.
And this gets a lot worse when it’s combined with our sharp cognitive limitations. We seem to have roughly the least cognitive capacity that still lets a species as a whole very slowly invent and build technologies.
We are idiots, every last one of us. Rationalists with tons of knowledge are a bit less idiotic, but let’s not get cocky: we’re still monkey-brained idiots. We just don’t have the cognitive horsepower to do the Bayesian math on all the relevant evidence, because important topics are complex. And we’re resistant but far from immune to motivated reasoning: you’ve got to really love rationalism to enjoy being proven wrong, and so not turn away cognitively when it happens.
What I take from all this is that humans are nobly struggling against our own cognitive limitations. We should try harder in the face of rationality being challenging. Success is possible, just not easy and never certain. And very few people are really bad; they’re just deluded.
To your exact question:
Musk believes in an intelligence explosion. He cares a lot about the culture war because, roughly as he puts it, he’s addicted to drama. I don’t know about Thiel.
Most of humanity does not believe in an intelligence explosion happening soon. So actually people who both believe in a singularity and still care about culture wars are quite rare.
I do wonder why people downvoted this quite reasonable question. I suspect they’re well-meaning monkey-brained idiots, just like the rest of us.
I think the sad part is that although these people are quite rare, they represent a big share of singularity believers’ potential influence. For example, Elon Musk alone has a net worth of $400 billion, while worldwide AI safety spending is between $0.1 and $0.2 billion per year.
If the story of humanity were put in a novel, it might be one of those novels that feel quite sour. There’s not even a great battle where the good guys organized themselves, did their best, and lost honorably.
I disagree. There is such a battle. It is happening right now, in this very conversation. The rationalist x-risk community is the good guys, and we will be joined by more as we organize. We are organizing right now, and already fighting aspects of that battle. It won’t be fought with weapons but with ideas. We are honing our ideas and working out goals and strategies. When we figure out what to do in the public sphere, we will fight to get it done. We are already fighting to figure out alignment of AGI, and starting to work on alignment of humans to meet that challenge.
It’s a shame Musk hasn’t joined up, but in most good stories, the good guys are the underdogs anyway.
Now, I’d much rather live in dull times than exciting ones. But here we are. Time to fight. The main enemy is our collective monkey-brained idiocy.
Join the fight!
:) that’s a better attitude. You’re very right.
On second thought, just because I don’t see the struggle doesn’t mean there is none. Maybe someday in the future we’ll learn the real story, and it’ll turn out beautiful, with lots of meaningful spirit and passion.
Thank you for mentioning this.