I think this particular argument would dissolve away if the paper said “may allow AIs to acquire vastly more optimization power” instead of “vastly more intelligent”.
The key point here is not that AIs have more computational resources available than humans, but that they are (presumably) able to translate extra computational resources directly into extra cognitive power. So they can use that particular resource much more efficiently than humans.
EDIT: actually I’m confusing two concepts here. There’s “computer hardware”, which is an external resource that AIs are better at utilizing than we are. Then there’s “computational power”, which AIs obtain from computer hardware and we obtain from our brains. This is an internal resource, and while I believe it’s what the paper was referring to as “increased computational resources”, I’m not sure it counts as a “resource” for the purposes of Yudkowsky’s definition of intelligence.