In “The Maes-Garreau Point,” Kevin Kelly lists poorly referenced predictions of “when they think the Singularity will appear” of 2001, 2004, and 2005 — by Nick Hogard, Nick Bostrom, and Eliezer Yudkowsky respectively.
But it’s only a potential warning sign — fusion power is always 25 years away, but so is the decay of a promethium-145 atom.
Right, but we expect that for the promethium atom. If physicists had predicted that a certain radioactive sample would decay within a fixed time, kept pushing that date further into the future as it failed to happen, and didn’t alter their hypotheses at all, I’d be very worried about the state of physics.
Not off the top of my head, which is one reason I didn’t bring it up until I got pissed off :) I remember a number of people predicting 2000 over the last decades of the 20th century; I think Turing himself was one of the earliest.
Turing never discussed anything much like a Singularity, to my knowledge. What you may be thinking of is that in his original 1950 article proposing the Turing Test, he said he expected it would take around fifty years for machines to pass it. But Turing’s remark is not the same claim as a Singularity occurring in 2000. Turing was off about when we’d have AI; as far as I know, he didn’t comment on anything like a Singularity.
Ah, that’s the one I’m thinking of — he didn’t comment on a Singularity, but did predict human-level AI by 2000. Some later people did, but I didn’t save any citations at the time and a quick Google search didn’t find any, which is one of the reasons I’m not writing a post on failed Singularity predictions.
Another reason, hopefully, is that there has always been a wide range of predictions, and there’s a lot of room for proving points by being selective about which ones to highlight. Even if you looked at all predictions, there are selection effects: the ones that were repeated, or even stated in the first place, tend to be the more extreme ones.
So, I’m vaguely aware of Singularity claims for 2010. Do you have citations for people making such claims that it would happen in 2000 or 2005?
I agree that pushing something farther and farther into the future is a potential warning sign.