Actually, I don’t think you’re right. There isn’t much consensus on the issue within the community, so there’s not much of a conclusion to draw:
Last year’s survey answers to “which disaster do you think is most likely to wipe out greater than 90% of humanity before the year 2100?” were as follows:
Pandemic (bioengineered): 272, 23%
Environmental collapse: 171, 14.5%
Unfriendly AI: 160, 13.5%
Nuclear war: 155, 13.1%
Economic/Political collapse: 137, 11.6%
Pandemic (natural): 99, 8.4%
Nanotech: 49, 4.1%
Asteroid: 43, 3.6%
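As a quick sanity check on those numbers, here’s a minimal sketch that recomputes the percentages from the counts. The total respondent count isn’t given above, so it’s inferred from the 23% figure for bioengineered pandemic; treat it as an approximation:

```python
# Counts quoted from the survey results above.
counts = {
    "Pandemic (bioengineered)": 272,
    "Environmental collapse": 171,
    "Unfriendly AI": 160,
    "Nuclear war": 155,
    "Economic/Political collapse": 137,
    "Pandemic (natural)": 99,
    "Nanotech": 49,
    "Asteroid": 43,
}

# The total is inferred, not reported: 272 respondents were 23% of answers.
total = 272 / 0.23  # roughly 1,183

for disaster, n in counts.items():
    print(f"{disaster}: {n} ({n / total:.1%})")
```

Running this reproduces the quoted percentages to within rounding, so the breakdown above is at least internally consistent.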
I deliberately didn’t say that the majority of LessWrongers would give that answer, partly because LessWrong is only about 1⁄3 computer scientists/programmers. Also, 13.5% is very high compared to most communities.
I didn’t explicitly state an argument, but if I were to, it would be this: communities with an interest in topic X are the most likely to think that topic X is the most important thing ever. So my argument doesn’t require that most computer scientists think unfriendly AI is the biggest problem, only that computer scientists are more likely than anyone else to think it is.
Fortunately we have the census, and the census does ask about profession. Among respondents who listed their profession as Computers (AI), Computers (practical: IT, programming, etc.), or Computers (other academic, computer science), 14.4% think that unfriendly AI is the biggest threat.
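For anyone who wants to reproduce that figure, here’s a minimal sketch of the calculation against a CSV export of the census. The file name and column labels (lesswrong_census.csv, Profession, XRiskType) are assumptions for illustration; the real export may use different ones:

```python
import csv

# The three profession categories treated as "computer people" above.
COMPUTER_PROFESSIONS = {
    "Computers (AI)",
    "Computers (practical: IT, programming, etc.)",
    "Computers (other academic, computer science)",
}

# Hypothetical file name and column labels; adjust to the actual census export.
with open("lesswrong_census.csv", newline="") as f:
    rows = [r for r in csv.DictReader(f) if r.get("Profession") in COMPUTER_PROFESSIONS]

# Share of computer-profession respondents picking each disaster.
answers = [r["XRiskType"] for r in rows if r.get("XRiskType")]
for risk in sorted(set(answers)):
    print(f"{risk}: {answers.count(risk) / len(answers):.1%}")
```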
LessWrong isn’t a community that focuses much on bioengineered pandemics. Yet among those same computer professionals, 23.7% still think a bioengineered pandemic is the greatest threat.
We are a community that actually cares about data.