I suspect there’s a school of thought for which “singularity” was massively overoptimistic—is this what you mean by Kurzweilian magical thinking? That it’s a transition in a very short period of time from scarcity-based capitalism to post-scarcity utopia. Rather than a simple destruction of most of humanity, and of the freedom and value of those remaining.
No, that part of Kurzweil’s view is 100% fine. In fact, I expect a sharper transition than Kurzweil does. My objection to Kurzweil’s thinking isn’t ‘realistic mature futurists are supposed to be pessimistic across the board’; it’s specific unsupported flaws in his arguments:
Rejection of Eliezer’s five theses (which were written in response to Kurzweil): intelligence explosion, orthogonality, convergent instrumental goals, complexity of value, fragility of value.
Mystical, quasi-Hegelian thinking about surface trends like ‘economic growth’. See the ‘Actual Ray Kurzweil’ quote in https://www.lesswrong.com/posts/ax695frGJEzGxFBK4/biology-inspired-agi-timelines-the-trick-that-never-works.
Otherwise weird and un-Bayesian-sounding attitudes toward forecasting. He seems to think he has a crystal ball that lets him exactly time tech developments, even where he has no model of any causal path by which he could be entangled with evidence about that future development.