Let’s not jump down his throat. It’s a current evaluation from shallow research, not an expert-level essay.
I will proceed to jump down his throat.
vague claims about technology, IQ having a fundamental bound, and IQ sucking as a metric anyway
That’s rather too vague to analyze.
If being really smart won’t help you (on real-life instances, not just asymptotically) because you’re jumping up the hierarchy, there’s still a lot to get from improving heuristics, looking into increasingly specialised heuristics, and just throwing more power at the problem. But we don’t have a model detailed enough to provide a bound at all!
Singularity fans are right about two things: machines will outthink humans within fifty to a hundred years, and the pace of technological advancement will accelerate.
Okay, either he’s agreeing with Singularitarians but doesn’t want to admit it, or he expects tech will run into a wall really fast for no specified reason.
this is already accounted for by existing models of technological advancement, e.g. Moore’s Law
...nobody is denying that surface laws like these exist. Singularitarians are claiming that there are deeper reasons why these models are and stay true. Next he’s going to tell us that Newton’s laws are useless because we already have a parabolic model of freefall.
The Singularity simply describes one way this pace will be maintained: by the recruitment of AI.
Ehn, two schools out of three ain’t bad.
It therefore doesn’t predict anything remarkable
If creating the smartest thing in the universe is unremarkable, I want to see what impresses Carrier.
certainly doesn’t deserve such a pretentious name
I have to back him on that one.
there will be a limit, an end point
What is wrong with people that makes them understand “a bound exists” as “the bound is smallish”?
we can already do this now, yet we don’t see moon-sized computers anywhere—a fact that reveals an importantly overlooked reality: what things cost
...yes, we don’t see moon-sized computers because they’re more expensive for the same performance gain than reducing and speeding up individual components. When those avenues are exhausted, it will become much more economically viable to make huge computers.
Again: what is wrong with people that makes them understand “a bound exists” as “the bound is smallish”?
Modus tollens: “no small bound exists” → “no bound exists”; e.g., life extension is immortality (but immortality is physically impossible, so life extension must be impossible too).
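Spelled out formally (my reconstruction of the inference, not Carrier’s wording; the predicate names are hypothetical labels):

```latex
% Smuggled premise (false): any bound on intelligence would be a small one.
P_1:\quad \forall b\,\bigl(\mathrm{Bound}(b) \rightarrow \mathrm{Small}(b)\bigr)
% Observed: no small bound is in evidence.
P_2:\quad \neg\exists b\,\bigl(\mathrm{Bound}(b) \wedge \mathrm{Small}(b)\bigr)
% Modus tollens (via contraposition of P_1) then licenses:
C:\quad \neg\exists b\,\mathrm{Bound}(b)
% The step from P_1 and P_2 to C is valid; the error is P_1 itself.
```

The life-extension example runs the same way: a valid modus tollens launders the false conditional “life extension implies immortality” into a confident impossibility claim.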