I would probably be an N, but I’d need a better definition of “singularity”. In fact, I think the question would be more interesting if it were split into three: superhuman AI, AI which self-improves at Moore’s-law rates or faster, and AI domination of the physical world at a level that would make the gap between chimpanzee technology and human technology look small. All three could be expressed as a probability of happening before 2100, since such a probability should still carry enough information to mostly distinguish “not for a long time” from “never”.
Oops… this was meant to be a