I took the survey too. I would strongly recommend changing the Singularity question to read:
“If you don’t think a Singularity will ever happen, write N for Never”
Or something like that. The fraction of people who assign high probability to Never is really interesting! You don’t want to lump them in with the people who don’t have an opinion.
I would probably be an N, but I’d need a better definition of “Singularity”. In fact, I think the question would be more interesting if it were split into three: superhuman AI, AI which self-improves at Moore’s-law rates or faster, and AI domination of the physical world at a level that would make the gap between chimpanzee technology and human technology look small. All three of these could be expressed as the probability of it happening before 2100, since such a probability still carries enough information to mostly distinguish “not for a long time” from “never”.