The neglected missing piece is understanding of intelligence. (Not understanding how to solve a Rubik's Cube, but understanding the generalized process that, upon seeing a Rubik's Cube for the first time and hearing the goal of the puzzle, figures out how to solve it.)
The point you are missing is that understanding intelligence is not a binary thing. It is a task that will be subject to the curve of capability like any other complex task.
What is binary is whether or not you understand intelligence enough to implement it on a computer that then will not need to rely on any much slower human brain for any aspect of cognition.
A useful analogy here is self-replicating factories. At one time they were proposed as a binary thing: just build a single self-replicating factory that does not need to rely on humans for any aspect of manufacturing, and you have an unlimited supply of manufactured goods thereafter.
It was discovered that in reality it's about as far from binary as you can get. While in principle such a thing must be possible, in practice it's so far off as to be entirely irrelevant to today's plans; what is real is a continuous curve of increasing automation. By the time our distant descendants are in a position to automate the last tiny fraction, it may not even matter anymore.
Regardless, it doesn't matter. Computers are bound by the curve of capability just as much as humans are. There is no evidence that they're going to possess any special sauce that will let them past it, and plenty of evidence that they aren't.
My theory is falsifiable. Is yours? If so, what evidence would you agree would falsify it, if said evidence were obtained?