If you give me so much money that I’m no longer getting much marginal value from it, you’re not actually making me stupider.
In Yudkowsky’s model, it’s the resources needed to complete the task. If you can solve problem x for $5, and I give you a million dollars, you can still solve problem x for $5.
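To make that explicit (this is my own formalization, not notation from Yudkowsky): let c(x) be the minimum resources an agent needs to complete task x, and r its total endowment. Then c(x) is a property of the task and the agent’s ability, not of r:

$$c(x) = \min\{\, r' : \operatorname{solves}(x, r') \,\}, \qquad \operatorname{capable}(x, r) \iff r \ge c(x)$$

Raising r from $5 to $1,000,005 leaves c(x) = $5 unchanged; the extra endowment is just unused slack, and only matters for tasks with higher c(x).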
Good point—I’d missed that particular subtlety.
There’s another flaw in the model I presented: I was only thinking about goals that conflict with other agents’ goals. “Solve problem x for $5”-type tasks may not fall into that category, but may still require a lot of “intelligence” to solve (although narrow intelligence may be enough).