I agree with this. Look at the list of SIAI’s publications—not all of them would have required a math genius to write.
I think that some of the most important papers relating to the Singularity so far have been (in no particular order) Eliezer’s CEV proposal, Omohundro’s Basic AI Drives, Robin Hanson’s If uploads come first, Carl Shulman’s Whole brain emulations and the evolution of superorganisms, Eliezer’s Artificial Intelligence as a Positive and Negative Factor in Global Risk, Anders Sandberg’s and Nick Bostrom’s Whole brain emulation roadmap, lukeprog and Louie Helm’s The Singularity and Machine Ethics, and the intelligence explosion paper that lukeprog and Anna Salamon are writing right now. I’d also like to imagine that my two draft papers might have some value. None of those would have required a math genius for a writer—in fact, the WBE roadmap is probably the only one that required any math knowledge at all. Of possible future directions, many of the Singularity and Machine Ethics proposals can be done without being a math genius as well.
Then there are various other useful paths that are less directly associated with research. Popular writing (the career path that I’m thinking about concentrating on) might inspire countless people to pursue FAI-related work if done well. (Or turn them away from it, if done badly.)
Note that the “it’s better to earn money to fund others to do research” presumption assumes that there are people who can be hired to do research. If everyone who’s interested in FAI/Singularity issues and isn’t a math genius decides to just earn money instead, then the only people who can be hired to work on FAI/Singularity issues are either math geniuses or folks who’d rather be doing something else but agree to do this because it pays the bills. It’d be better to have both math geniuses and genuinely motivated people ready to do the tasks that aren’t the math geniuses’ comparative advantage.