I think it pays to be rationally ignorant. It is an economic fact that the more people specialize, the more they get paid and the better their chances of making a significant contribution in their particular field. You can’t be your best as a doctor if you spend valuable time reading textbooks on Western philosophy or quantum computing instead of textbooks on disease. There is a saying that captures this thought: “jack of all trades, master of none”. Sure, some fields, such as AI, sit at the intersection of many sciences, but I doubt that most people on this blog (myself included) can handle that much information while still producing new results in the field in a reasonable amount of time.
So, instead of reading the introductory textbook for every field and science (I bet there are more such fields than anyone can cover in a normal, no-singularity lifespan), the best approach for me is to learn a little about each field in my free time: just enough that I will not be ignorant to the point of making serious mistakes about the nature of reality, and light enough on the mind that I keep my processing power for the main work of digging as deep as possible into the field of my choice.
So, I disagree with the author and think that Teaching Company courses are more useful than textbooks, except for textbooks in your chosen specialty.
There is a real danger in becoming more absorbed with the exploration of rationality and science than with focusing on, and excelling in, your own field. I myself am guilty of this.
If you really want to save lives, you would do better to donate to people who do more than write papers. Aubrey de Grey’s institute might be a better place to start.
The bottom line is that SingInst is just a huge money drain. It does nothing useful, and all it has ever produced is a pile of papers. Worse, it subsidizes a slacker genius like Yudkowsky, who really should find better uses for his mind than armchair philosophy about “friendly AI” when we don’t even have the knowledge to build an AI with the intelligence of a ten-year-old. Mr. Yudkowsky could actually build not one but several intelligences greater than those of probably 95% of the humans on the planet, all of them almost guaranteed to be friendly. He simply has to shave that ugly beard, stop being so nerdy, and actually meet smart women like himself. His IQ is probably above the already-high Ashkenazi average, so having, say, ten children and directing each toward a career in a field that could help eliminate human aging and death would probably do more for humanity than endless philosophizing ever will.
The same goes for the rest of you who know you are smarter than the rest of humanity but still allow mentally challenged people to outbreed you, shrinking the proportion of people on this planet whose brains can actually understand science and rationality.