To be more specific, I don't think writing a book on rationality is a worthwhile use of time for one of the few people who might be capable of formalizing some important problems, especially since many books on rationality already exist. Even if Eliezer Yudkowsky can concisely synthesize everything the world knows about rationality, that won't impress the important academics enough to actually believe him on AI issues.
Rationality is probably a moderately important factor in planetary collective intelligence. Pinker claims that rational thinking + game theory have also contributed to recent positive moral shifts. Though there are some existing books on the topic, it could well be an area where a relatively small effort could produce a big positive result.
However, I’m not entirely convinced that hpmor.com is the best way to go about it...
There was a list of problems posted recently:
It turns out that HPMOR has been great for SI recruiting and networking: IMO winners apparently read it, and so do an absurd number of Googlers.