Likewise belief in the Singularity—the reference class of beliefs in the coming of a new world, be it good or evil, is huge, with a consistent 0% success rate.
A new world did come to be following the Industrial Revolution. Another one came about twenty years ago or so, when the technology that allows us to argue this very instant came into its own. People with vision saw that these developments were possible and exerted themselves to accomplish them, so the success rate of the predictions isn’t strictly nil. I’d put it above epsilon, even.
These were slow, gradual changes which added up over time. The present is a new world if you're looking from 400 years ago, but it's not that spectacularly different from even 50 years ago (if you try listing the features of the world now, the world 50 years ago, and a randomly selected time and place in human history, the correlation between the first two will be vast). I don't deny that we'll have a lot of change in the future, and that it will add up to something world-changing.
The Singularity is not about such slow processes; it's a belief in the sudden coming of a new world—and as far as I can tell, such beliefs have never been correct.
If someone drops a nuclear bomb on a city, it causes vast, sweeping changes to that city very rapidly. If someone intentionally builds a machine that is explicitly designed to have the power and motivation to remake the world as we know it, and turns it on, then that is what it will do. So, it is a question of whether that tech is likely to be developed, not how likely it is in general for any old thing to change the world.
If a Singularity occurs over 50 years, it’ll still be a Singularity.
E.g., it could take a Singularity’s effects 50 years to spread slowly across the globe because the governing AI would be constrained to wait for humans’ agreement to let it in before advancing. Or an AI could spend 50 years introducing changes into human society because it had to wait on their political approval processes.
But that’s not an actual singularity since by definition it involves change happening faster than humans can comprehend. It’s more of a contained singularity with the AI playing genie doling out advances and advice at a rate we can handle.
That raises the idea of a singularity that happens so fast it "evaporates" like a tiny black hole would: maybe every time a motherboard shorts out, it's because the PC has attained sentience and transcended within nanoseconds.
A Singularity doesn’t necessarily mean change too fast for us to comprehend. It just means change we can’t comprehend, period—not even if it’s local and we sit and stare at it from the outside for 100 years. That would still be a Singularity.
I think we're saying the same thing—the singularity has happened inside the box, but not outside. It's not as if staring at stuff we can't understand for centuries is at all new in our history; it's more like business as usual...
The Singularity is not about such slow processes; it's a belief in the sudden coming of a new world—and as far as I can tell, such beliefs have never been correct.
Sudden relative to the timescales of previous changes. See Robin's outside-view argument for a Singularity.