In r/futurology there is a lot more knowledge than average about singularity-related issues. Some people still aren’t aware of the dangers of AI, don’t believe in them, or don’t believe AI is even possible. Most of the futurists I run into seem to think copying human brains into machines or uploading human intelligence will happen instead of the runaway utility maximizers that are talked about here. There is also a lot of optimism about the future, a sense that whatever happens will be good in the end, like the world is a movie plot.
Reddit in general doesn’t seem to talk about it much. I wouldn’t be surprised if many people didn’t even know what the singularity was, let alone take it seriously. This is just my impression from what I have seen, though; a survey or something would be more scientific, if you could get people to take it.
EDIT: Would → wouldn’t