I observe that many futuristic predictions are, likewise, best considered as attitude expressions. Take the question, “How long will it be until we have human-level AI?” The responses I’ve seen to this are all over the map. On one memorable occasion, a mainstream AI guy said to me, “Five hundred years.” (!!)
Did you ask any of them how long they felt it would take to develop other “futuristic” technologies? (In other words, did you elicit their rank ordering of expected technological changes?)