For anyone who needs references or argument ideas from outside of the sequences: the arguments I found most compelling for a hard takeoff are in LOGI part 3 and the Wiki interview with Eliezer from around 2003.
“a point in time when the speed of technological progress becomes near-infinite (i.e., discontinuous), caused by advanced technologies”
http://www.acceleratingfuture.com/wiki/Wiki_Interview_With_Eliezer/The_Singularity
“Near infinite” is mystical maths. There’s no such thing as “near infinite” in real maths. Things are either finite or they are not.
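One minimal way to make Tim’s point precise (a sketch in standard real-analysis notation; nothing below is from the original thread): the reals are unbounded above, so

$$\forall x \in \mathbb{R} : \; x < x + 1 < \infty,$$

and every real number, however large, is still only finite. A quantity is either finite or it is not; “near infinite” picks out nothing in between.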
Yes Tim, that’s absolutely correct. That alternate meaning is complete bullshit, but it exists nonetheless. Very unfortunate, but I see very few people taking the initiative to stamp it out in the wider world.
I think “2) a point in time when prediction is no longer possible (a.k.a., “Predictive Horizon”)” is equally nonsensical. Eliezer seems to agree:
“The Predictive Horizon never made much sense to me”
...and so does Nick, quoted later in the essay:
“I think it is unfortunate that some people have made Unpredictability a defining feature of “the singularity”. It really does tend to create a mental block.”
Robin Hanson thinks that the unpredictability idea is silly as well.
Yet aren’t these two the main justifications for using the “singularity” term in the first place?
If the rate of progress is not about to shoot off to infinity, and there isn’t going to be an event-horizon-like threshold at some future point in time, then two of the major justifications for using the “singularity” term have gone down the toilet.
To me—following the agricultural/industrial terminology—it looks as though there will be an intelligence revolution—and then probably a molecular nanotechnology/robotics revolution not long after.
Squishing those two concepts together into “singularity” paste offends my sense of how historical events should be named. I think it is confusing, misleading, and pseudo-scientific.
Please quit with the ridiculous singularity terminology!
http://alife.co.uk/essays/the_singularity_is_nonsense/
I thought similarly about LOGI part 3 (Seed AI). In fact, it occurred to me immediately, and I put a link to it on the wiki page.