Game theory is not like calculus or evolutionary theory—something any alien race smart enough to develop space travel is likely to formulate. It does represent human values.
Can you explain this? I always thought of game theory as being like calculus, and not about human values (like this comment says).
You solve games by applying solution criteria. Unfortunately, for any reasonable list of solution criteria you will always be able to find games where the result doesn’t seem to make sense. Also, there is no set of obviously correct and complete solution concepts. Consider the following game:
Two rational people simultaneously and secretly write down a real number in [0,100]. The person who writes down the highest number gets a payoff of zero, and the person who writes down the lowest number gets that number as his payoff. If there is a tie they each get zero. What happens?
The only “Nash equilibrium” (the most important solution concept in all of game theory) is for both players to write down 0, but this is a crazy result because picking 0 is weakly dominated by picking any other number (except 100).
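A quick numerical check of those two claims (my own sketch, not part of the original comment): discretize the game to integer choices 0–100. The discretization does not preserve the uniqueness claim (the grid picks up a few extra near-zero equilibria), but it does let you verify that (0, 0) is an equilibrium and that 0 is weakly dominated by everything except 100.

```python
def payoff(mine, theirs):
    """Payoff to the player who wrote `mine` against an opponent who wrote `theirs`."""
    if mine > theirs:      # the highest number gets zero
        return 0
    if mine < theirs:      # the lowest number gets that number as a payoff
        return mine
    return 0               # tie: both get zero

CHOICES = range(101)       # integer stand-in for [0, 100]

def is_nash(a, b):
    """Neither player can gain by unilaterally deviating from (a, b)."""
    return (all(payoff(d, b) <= payoff(a, b) for d in CHOICES) and
            all(payoff(d, a) <= payoff(b, a) for d in CHOICES))

print(is_nash(0, 0))  # True: both writing 0 is an equilibrium

def weakly_dominates(x, y):
    """x never does worse than y and does strictly better against some opponent choice."""
    return (all(payoff(x, b) >= payoff(y, b) for b in CHOICES) and
            any(payoff(x, b) > payoff(y, b) for b in CHOICES))

print([x for x in CHOICES if weakly_dominates(x, 0)] == list(range(1, 100)))
# True: 0 is weakly dominated by every number except 100 (and itself)
```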
Game theory also has trouble solving many games where (a) Player Two only gets to move if Player One does a certain thing, (b) Player One’s strategy is determined by what he expects Player Two would do if Player Two gets to move, and (c) in equilibrium Player Two never moves.
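For concreteness, here is a sketch of the kind of game described in (a)–(c), using a little entry-deterrence game whose payoff numbers I am making up for illustration. Player Two only moves if Player One plays In, and in one of the Nash equilibria Player Two never moves at all, so nothing pins down whether his threat is believable without a further (chosen) criterion such as subgame perfection.

```python
# Payoffs are (Player One, Player Two). Player Two's choice only matters after "In".
PAYOFFS = {
    ("Out", "Accommodate"): (1, 3),   # Player Two never actually moves
    ("Out", "Fight"):       (1, 3),
    ("In",  "Accommodate"): (2, 2),
    ("In",  "Fight"):       (0, 0),
}

P1_STRATEGIES = ["Out", "In"]
P2_STRATEGIES = ["Accommodate", "Fight"]

def is_nash(s1, s2):
    """Neither player can gain by unilaterally switching strategies."""
    u1, u2 = PAYOFFS[(s1, s2)]
    best1 = all(PAYOFFS[(d, s2)][0] <= u1 for d in P1_STRATEGIES)
    best2 = all(PAYOFFS[(s1, d)][1] <= u2 for d in P2_STRATEGIES)
    return best1 and best2

print([(s1, s2) for s1 in P1_STRATEGIES for s2 in P2_STRATEGIES if is_nash(s1, s2)])
# [('Out', 'Fight'), ('In', 'Accommodate')]
# In ('Out', 'Fight'), Player Two never moves, so Nash equilibrium alone cannot say
# whether his threat to Fight is credible; ruling it out takes an extra criterion.
```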
I’m not understanding you; the things you describe in this post seem to be the kind of maths a smart alien race might discover just like we did.
Many games don’t have solutions, or the solutions depend on arbitrary criteria.
… and?
Are you agreeing or disagreeing with “the things you describe in this post seem to be the kind of maths a smart alien race might discover just like we did”?
It depends on what you mean by “might” and “discover” (as opposed to invent). I predict that smart aliens’ theories of physics, chemistry, and evolution would be much more similar to ours than their theories of how rational people play games would be.
How so? Game theory basically studies interactions between two (or more) agents that make choices whose outcomes depend on what the other agent does. You can use game theory to model the interaction between two pieces of software, for example.
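To make the software point concrete, a minimal sketch (the bots and payoff numbers are my own illustrative choices): two small programs playing a repeated Prisoner’s Dilemma, each one’s payoff depending on the other’s choices.

```python
PD = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
      ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(bot_a, bot_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a, move_b = bot_a(hist_b), bot_b(hist_a)
        pay_a, pay_b = PD[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # (9, 14): one exploited round, then mutual defection
```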
Please see my answer to PECOS-9.
I still don’t see what all this has to do with human values.
I am talking about game theory as a field of inquiry. You’re talking about the current state of the art in this field and pointing out that it has unsolved issues. So? Physics has unsolved issues, too.
There are proofs showing that game theory can never be solved.
I still don’t see what all this has to do with human values.
I also don’t understand what it means for game theory to “be solved”. If you mean that in certain specific situations you don’t get an answer, that’s true for physics as well.
Game theory would be solved if there were a set of reasonable criteria which, if applied to every possible game played by rational players, would tell you what the players would do.
To continue with physics: physics would be solved if there were a set of reasonable criteria which, if applied to every possible interaction of particles, would tell you what the particles would do.
Consider a situation in which, using physics, you could prove both that (1) X won’t happen and (2) X will happen. If such a situation existed, physics couldn’t be solved, but my understanding of science is that such a situation is unlikely to exist. Alas, this kind of situation does come up in game theory.
Well, it’s math but...
Whether you get an answer depends on the criteria you choose, but these criteria necessarily contain an element of arbitrariness, even for rational players. Consider the solution concept “never play a weakly dominated strategy.” This is neither right nor wrong but an arbitrary criterion that reflects human values.
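One concrete way that arbitrariness shows up, sketched with a toy 3×2 game of my own choosing (in the spirit of standard textbook examples): if you apply “never play a weakly dominated strategy” iteratively, the surviving prediction can depend on the order in which you delete strategies, and nothing in the criterion itself tells you which order is right.

```python
# Payoffs are (row player, column player).
GAME = {
    ("T", "L"): (1, 1), ("T", "R"): (0, 0),
    ("M", "L"): (1, 1), ("M", "R"): (2, 1),
    ("B", "L"): (0, 0), ("B", "R"): (2, 1),
}

def weakly_dominated(player, rows, cols):
    """Strategies of `player` (0 = row, 1 = column) weakly dominated within rows x cols."""
    own, other = (rows, cols) if player == 0 else (cols, rows)
    def u(mine, theirs):
        key = (mine, theirs) if player == 0 else (theirs, mine)
        return GAME[key][player]
    return {y for y in own for x in own
            if x != y
            and all(u(x, t) >= u(y, t) for t in other)
            and any(u(x, t) > u(y, t) for t in other)}

def eliminate(order):
    """Iteratively delete weakly dominated strategies, using `order` to break ties."""
    rows, cols = {"T", "M", "B"}, {"L", "R"}
    while True:
        doomed = ([(0, s) for s in weakly_dominated(0, rows, cols)] +
                  [(1, s) for s in weakly_dominated(1, rows, cols)])
        if not doomed:
            return sorted(rows), sorted(cols)
        player, s = min(doomed, key=lambda d: order.index(d[1]))
        (rows if player == 0 else cols).discard(s)

print(eliminate(["T", "L", "B", "R"]))  # (['B', 'M'], ['R'])  delete T first, then L
print(eliminate(["B", "R", "T", "L"]))  # (['M', 'T'], ['L'])  delete B first, then R
# Same criterion, different (equally defensible) deletion order, different prediction.
```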
Saying “the game theory solution is A,Y” is closer to “this picture is pretty” than to “the electron will...”
Also, assuming someone is rational and wants to maximize his payoff isn’t enough to fully specify him, and consequently you need to bring in human values to figure out how this person will behave.
You seem to be talking about forecasting human behavior and giving advice to humans about how to behave.
That, of course, depends on human values. But that is related to game theory in the same way engineering is related to mathematics. If you are building a bridge you need to know the properties of materials you’re building it out of. Doesn’t change the equations, though.
You know that a race of aliens is rational. Do you need to know more about their values to predict how they will build bridges? Yes. Do you need to know more about their values to predict how they will play games? Yes.
Game theory is (basically) the study of how rational people behave. Unfortunately, there will always exist relatively simple games for which you cannot use the tools of game theory to determine how players will behave.
Ah. We have a terminology difference. I defined my understanding of game theory a bit upthread and it’s not about people at all. For example, consider software agents operating in a network with distributed resources and untrusted counterparties.