Yes, and first of all, why are you even attempting to add “2x”? A reasonable argument would be “~1x”, as in, the total storage of all state outside the brain is so small it can be neglected.
I mean... sure... but again, this does not affect the validity of my counterargument. Like I said, I’m making the counterargument as strong as possible by saying that even if the non-brain parts of the body were to add 2-100x computing power, this would not restrict our ability to scale up NNs to get human-level cognition. Obviously the argument still holds if we replace “2-100x” with “~1x”.
The advantage of “2-100x” is that it is extraordinarily charitable to the “embodied cognition” theory: if (and I consider this extremely unlikely) embodied cognition does turn out to be true in some strong sense, then “2-100x” covers that case in a way that “~1x” does not. And I may as well be extraordinarily charitable to embodied cognition, since “Bitter lesson”-type reasoning is independent of its veracity.