If intelligence=optimization power/resources used, this might well be the case. Nonetheless, this “intelligence implosion” would still involve entities with increasing resources and thus increasing optimization power. A stupid agent with a lot of optimization power (Clippy) is still dangerous.
I agree that it would be dangerous.
What I’m arguing is that dividing by resource consumption is an odd way to define intelligence. For example, under this definition, is a mouse more intelligent than an ant? Clearly a mouse has much more optimization power, but it also has a vastly larger brain. So once you divide out the resource difference, maybe ants are more intelligent than mice? It’s not at all clear. That this could even be a possibility runs strongly counter to the everyday meaning of intelligence, as well as to definitions given by psychologists (as Tim Tyler pointed out above).
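The worry can be made concrete with toy numbers. The sketch below uses purely hypothetical optimization-power values (with rough published neuron counts standing in for "resources") to show how dividing by resources can flip a ranking; none of the figures are meant as real measurements of either animal.

```python
# Illustrative sketch of why "intelligence = optimization power / resources"
# can invert intuitive rankings. The optimization-power numbers are
# hypothetical, chosen only to make the structural point.

def efficiency(optimization_power, resources):
    """Resource-normalized 'intelligence' under the definition being discussed."""
    return optimization_power / resources

# Neuron count as the resource (rough published figures);
# optimization power in arbitrary hypothetical units.
mouse = {"power": 1000.0, "neurons": 70e6}   # mice: ~70 million neurons
ant   = {"power": 5.0,    "neurons": 250e3}  # ants: ~250 thousand neurons

mouse_eff = efficiency(mouse["power"], mouse["neurons"])  # ~1.4e-5
ant_eff = efficiency(ant["power"], ant["neurons"])        # 2e-5

# The mouse wins on raw optimization power...
assert mouse["power"] > ant["power"]
# ...but once resources are divided out, the ant comes out "more intelligent".
assert ant_eff > mouse_eff
```

With these (made-up) numbers the mouse has 200 times the ant's optimization power, yet the ant scores higher once brain size is divided out, which is exactly the counterintuitive possibility the definition leaves open.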