One thing this essay does not address is whether humans actually are likely to learn heuristics from playing videogames, or whether a large enough fraction of the population plays videogames for this to be a real concern.
Let’s briefly address that: there’s a fair bit of evidence that, across a wide variety of species, much of “play” behavior exists specifically to learn behaviors and rules for real life events. Wolf cubs, for example, engage in mock fights that prepare them for more serious encounters. Some species of corvids (crows, ravens, jays, etc.) will actively play with the large predators in their area, pecking at their tails or dropping objects on their faces, in an apparent attempt to learn the predators’ general behavior; that knowledge matters mainly because these corvids get much of their food from scavenging. Humans likely engage in play behavior in part for similar reasons. If so, there’s a real danger of people learning bad heuristics from videogames.
What percentage of the population plays videogames? A quick Google search turns up various numbers that disagree, but they seem to range from around a third to slightly over half; see for example here. Given that, this seems like a common enough issue to be worth discussing.
Is it obvious that a videogame is enough like the play a human child would do in the ancestral environment that it will activate the learning-by-play circuits? Our enjoyment does not imply that it is play in the sense our learning circuits recognise.
Play is about learning. Even games that we don’t think of as teaching us anything are fundamentally tied into our learning circuits. Even games as mindless as solitaire (Klondike) activate our learning circuitry whether or not we actually develop any skills by playing them, like an artificial neural network that keeps training past the point of usefulness, fluctuating around its plateau with every new example it sees.
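To make that analogy concrete, here is a minimal toy sketch (plain NumPy, made-up data and numbers, not anything from the original post) of a model that keeps taking gradient steps long after it has stopped improving; the loss just jitters around its noise floor with every new example, even though no real skill is accumulating.

```python
# Toy illustration: a model that keeps "training" long after there is nothing
# left to learn, analogous to playing solitaire past the point of improvement.
# All constants here are arbitrary assumptions for the sake of the sketch.
import numpy as np

rng = np.random.default_rng(0)

# Noisy linear data: y = 2x + 1 + noise. The noise puts a hard floor (~0.25)
# on the mean squared error, i.e. the model's "plateau".
x = rng.uniform(-1, 1, size=500)
y = 2 * x + 1 + rng.normal(0, 0.5, size=500)

w, b = 0.0, 0.0   # model parameters
lr = 0.05         # learning rate

for step in range(1, 5001):
    i = rng.integers(len(x))          # one "new example" per step
    err = (w * x[i] + b) - y[i]
    w -= lr * err * x[i]              # plain SGD update
    b -= lr * err
    if step % 1000 == 0:
        loss = np.mean((w * x + b - y) ** 2)
        print(f"step {step:5d}  loss {loss:.3f}")

# After the first few hundred steps the loss only fluctuates around the noise
# floor: the updates keep happening, but nothing meaningful is being learned.
```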
One of the most difficult aspects of video game design is scheduling difficulty increases; ideally, a game gets harder at the same pace that the gamer gets better, because getting better feels good. Engaging those learning circuits is one of the primary reasons games are fun in the first place.
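For what it’s worth, one common way designers approximate “gets harder at the same pace the player gets better” is some form of dynamic difficulty adjustment. Here is a hedged sketch of the core feedback loop (my own toy formulation; the class name, target rate, and step size are illustrative assumptions, not any real engine’s API):

```python
# Hypothetical sketch of dynamic difficulty adjustment: keep the player's
# recent success rate near a target by nudging difficulty up or down.
from collections import deque


class DifficultyTuner:
    def __init__(self, target_success=0.7, window=20, step=0.05):
        self.target = target_success        # win rate that "feels good"
        self.recent = deque(maxlen=window)  # rolling record of outcomes
        self.step = step
        self.difficulty = 1.0               # arbitrary starting difficulty

    def record(self, player_won: bool) -> float:
        """Record one encounter's outcome and return the adjusted difficulty."""
        self.recent.append(player_won)
        rate = sum(self.recent) / len(self.recent)
        if rate > self.target:              # player improving: ramp up
            self.difficulty += self.step
        elif rate < self.target:            # player struggling: ease off
            self.difficulty = max(0.1, self.difficulty - self.step)
        return self.difficulty
```

Real systems are far more involved (rubber-banding, per-mechanic tuning, hand-authored difficulty curves), but the core idea is just a feedback loop between measured performance and challenge.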
The real question to ask is whether learning bad habits in video games translates even a little bit to real life. This is an age-old debate, most frequently brought up when somebody claims that playing violent first-person shooters turns innocent children into mass-murdering psychopaths.
But we learn by working too, especially if the work is somewhat playful. Yes, videogames teach us, but they teach us while producing absolutely nothing, whereas work teaches us and, even if it is less fun, actually produces value (it makes the world better, and you get paid rather than paying). And what you learn at work (or in any productive enterprise, whether it’s an art project for Burning Man or a self-improvement project) is much more likely to be what you need to know for future work, whereas what you learn in a game may happen to be useful later, but may not.
Playey Work >> Workey Play
I agree with everything you said. We should be especially cautious about playing so-called casual games, where the modal player very quickly reaches a plateau beyond which he or she will learn nothing, much less learn anything useful.
The difference, of course, is that the learning process in Real Life is slooooow. In game X, after 30 hours of play, the modal player may be one or two orders of magnitude better at a given skill (one that is at least somewhat unique to the game) than someone who has been playing for two hours. Some games (e.g., some first-person shooters) require no unique skills; I suspect the skill curve looks similar, but most players are hundreds of hours along it rather than just a few, so the curve appears flatter and the differences proportionally smaller.
Contrast that to life: in mine, the skills I am trying to cultivate are the same ones I’ve been trying to cultivate for years, and my improvement is sometimes so glacial that I doubt whether I’m getting better at all. I could just be thousands of hours along similarly shaped curves, but I have certainly reached the point where I no longer see incremental improvement: all I see anymore are occasional insights that discretely improve my abilities.
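A quick back-of-the-envelope calculation shows how the same block of practice vanishes into a long, flat curve. This assumes, purely for illustration, a power-law learning curve (skill proportional to hours^0.4); the exponent is made up and the numbers are not meant to match the “orders of magnitude” above, only the shape.

```python
# Back-of-the-envelope illustration (assumed curve, not measured data):
# if skill grows roughly like hours**0.4, the same 28 extra hours of practice
# look very different early in the curve than thousands of hours in.
def skill(hours: float, exponent: float = 0.4) -> float:
    """Toy power-law learning curve; the exponent is an arbitrary assumption."""
    return hours ** exponent

early_gain = skill(30) / skill(2)         # new player: hours 2 -> 30
late_gain = skill(2030) / skill(2000)     # veteran: hours 2000 -> 2030

print(f"hours 2 -> 30:      {early_gain:.2f}x better")   # ~2.95x
print(f"hours 2000 -> 2030: {late_gain:.3f}x better")    # ~1.006x

# The same amount of practice that nearly triples a newcomer's skill is
# invisible to someone thousands of hours along the same curve.
```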
So, Playey Work is much better for us and for the world at large than Workey Play. The difficulty lies in valuing that value: truly internalizing the need to produce and be valued, so that it overwhelms the additional fun that video games offer and will always be able to offer.
I earnestly want to know how to do that. If there is one optimization that could be applied to humans, I think eliminating cognitive biases would do less to improve the human condition than increasing the weight each of us internally attaches to producing value for others. As for me, if I could do that, through meditation or cognitive behavioral therapy or hypnosis or whatever, I would finish my dissertation right quick and just as quickly move on to doing the things that I know I want to do, but right now want less than spending a dozen hours playing another video game.