By this definition, you can make it harder for someone to be smarter than you by being more random, even though this will hinder your ability to produce utility and thus make you less intelligent by a more common definition. In the most extreme case, nothing is smarter than a true random number generator, which is clearly not intelligent at all.
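The point about randomness can be made concrete with a toy game of matching pennies. In the sketch below (all names and payoffs are my own illustration, not from the original discussion), a player who calls heads or tails uniformly at random cannot be exploited: any predictor, clever or naive, averages a payoff near zero against them, even though the random player is doing nothing intelligent.

```python
import random

def play_matching_pennies(predictor, rounds=100_000, seed=0):
    """Average payoff of `predictor` against a uniformly random opponent.

    The predictor scores +1 when its guess matches the opponent's coin,
    -1 otherwise. Against true randomness, no prediction strategy helps.
    """
    rng = random.Random(seed)  # the "unpredictable" opponent
    total = 0
    for _ in range(rounds):
        guess = predictor()                    # predictor's call
        coin = rng.choice(("heads", "tails"))  # random opponent's play
        total += 1 if guess == coin else -1
    return total / rounds

# Two very different predictors fare equally badly: both averages
# land near zero, because there is no pattern to pick up on.
always_heads = lambda: "heads"
coin_flipper = lambda: random.choice(("heads", "tails"))
```

Running both predictors through `play_matching_pennies` gives averages close to zero, which is the sense in which nothing is "smarter" than the random number generator here, despite it having no strategy at all.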
If you can predict what someone will do, it seems you must be at least as intelligent as they are, since you could simply do what they would do. But you might underestimate the effectiveness of their strategies and overestimate the effectiveness of your own, and thus use your own, less effective strategy. For example, perhaps Alice favors teamwork and Bob favors independence, and this is common knowledge. Each of them can predict the other's actions, but each believes their own approach is more advantageous. Only one of them is more intelligent, and you're not likely to figure out which without actually testing them to see which strategy works better.
It’s a trivial nitpick, but I feel it should be pointed out that there could be many reasons other than “the individual whose strategy worked is more intelligent” for one strategy to work better than another, especially in a single test.
If you test the strategies multiple times in a variety of different circumstances, and one consistently works better, then the person using it is more instrumentally rational.
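A minimal sketch of why repetition matters, using an invented toy model (the payoff functions for "teamwork" and "independence" below are assumptions for illustration only): a single noisy trial can rank two strategies either way, but averaging over many varied circumstances recovers which one is more effective on average.

```python
import random

def teamwork(difficulty, rng):
    # Assumed model: teamwork scales well with task difficulty but is noisy.
    return difficulty * 2.0 + rng.gauss(0, 1)

def independence(difficulty, rng):
    # Assumed model: independence is steadier but scales worse.
    return difficulty * 1.5 + rng.gauss(0, 0.2)

def average_payoff(strategy, trials=10_000, seed=1):
    """Mean payoff over many trials with randomly varying difficulty."""
    rng = random.Random(seed)
    return sum(strategy(rng.random(), rng) for _ in range(trials)) / trials
```

Under these toy assumptions, any single trial is dominated by noise, but over thousands of varied trials the averages separate cleanly, so the repeated test (unlike the single one) tells you which strategy is better on average.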
Think of it like this: an intelligent being uses a variety of heuristics to figure out what to do. These heuristics need to be properly tuned to work well. It’s not that intelligent people are more capable of tuning their heuristics. It’s that tuning their heuristics is what makes them intelligent.