I would be worried about a movement that aims to make all of humanity more effective at everything they do.
It seems like the dangerous thing isn’t so much widespread rationality as widespread superpowerful tools. Like, to the point where you can easily build a nuke in your garage, or staple together an AI from available programs.
I’m not too worried about people getting better at achieving their goals, because most people at least understand that they don’t want to murder their friends and family. Like, a human tasked with eliminating malaria wouldn’t conclude that killing all life on Earth is the best way to accomplish that.
I’d prefer for most people to be better at achieving their goals, even if those goals aren’t x-risk reduction. Like, it would be nice if churches trying to raise the quality of life of poor people did better at doing so.
In some areas (like status), everyone getting better at accomplishing their goals would probably be zero-sum, but it’s already that way. As it stands, I would like rationalists to increase their status in order to better get things done, but once everyone’s a rationalist I don’t have strong preferences about who has more status than whom. So it’s not that big of a problem.
Some people are often mentioned as wanting destruction (suicide bombers, for instance). Gwern posted a nice article about how ineffective they are, and it seems like becoming more rational (and thus happier, more well-adjusted) would probably draw them out of terrorism.
I would particularly point out the section on how Black September was disbanded: http://www.gwern.net/Terrorism%20is%20not%20about%20Terror#about-the-chicks-man
It’s an amazing and striking anecdote.
Every movement that employs suicide bombing is highly political and absolutely not nihilist. All of them involve some preexisting milieu of hate and destruction, which provides the moral imperative behind the act. It’s about making the ultimate sacrifice of which a person is capable, not just about causing as much damage as possible.
The point stands, but I edited the wording.