Play is about learning. Even games that we don’t think of as teaching us anything are fundamentally tied into our learning circuits. A game as mindless as Klondike solitaire activates our learning circuitry whether or not we actually develop any skills by playing it, like an artificial neural network that keeps training past the point of usefulness, fluctuating around its plateau with every new example it sees.
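To make that analogy concrete, here’s a toy numeric sketch of the plateau-plus-fluctuation picture (plain Python; the ceiling, learning rate, and noise level are all invented for illustration, and this is a caricature of training dynamics, not an actual neural network):

```python
import random

# Toy model of the plateau: skill closes a fraction of the remaining gap
# each hour, so it rises fast early on and then stalls near the ceiling,
# while the per-session measurement keeps fluctuating around it.
CEILING = 100.0     # hypothetical maximum skill for this game
LEARN_RATE = 0.05   # fraction of the remaining gap closed per hour
NOISE = 2.0         # session-to-session fluctuation around the plateau

skill = 0.0
for hour in range(1, 201):
    skill += LEARN_RATE * (CEILING - skill)    # exponential approach to the ceiling
    observed = skill + random.gauss(0, NOISE)  # what any single session "sees"
    if hour % 40 == 0:
        print(f"hour {hour:3d}: true skill {skill:5.1f}, observed {observed:5.1f}")
```

Past hour 100 or so, every new “example” still moves the observed number around, but the underlying skill has stopped going anywhere.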
One of the most difficult aspects of video game design is pacing increases in difficulty: ideally, a game gets harder at the same rate that the gamer gets better, because getting better feels good. Engaging those learning circuits is one of the primary reasons games are fun in the first place.
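One common way designers chase that pacing is dynamic difficulty adjustment. Here’s a minimal sketch of the idea (the class, target, and step size are all hypothetical, not any particular engine’s API): keep the player’s recent win rate near a target by nudging difficulty up when they win too easily and down when they struggle.

```python
from collections import deque

class DifficultyScheduler:
    """Keep the player winning roughly TARGET_WIN_RATE of recent
    encounters by nudging difficulty up or down (values illustrative)."""
    TARGET_WIN_RATE = 0.6   # winning often enough to feel improvement
    WINDOW = 20             # number of recent encounters considered
    STEP = 0.05             # adjustment per encounter

    def __init__(self):
        self.recent = deque(maxlen=self.WINDOW)
        self.difficulty = 1.0

    def record(self, won: bool) -> float:
        """Call after each encounter; returns the updated difficulty."""
        self.recent.append(won)
        win_rate = sum(self.recent) / len(self.recent)
        if win_rate > self.TARGET_WIN_RATE:    # player improving: ramp up
            self.difficulty += self.STEP
        elif win_rate < self.TARGET_WIN_RATE:  # player struggling: ease off
            self.difficulty = max(0.1, self.difficulty - self.STEP)
        return self.difficulty
```

After each fight or level, a game loop would call record(won) and scale enemy stats by the returned multiplier, so the difficulty tracks the player’s improvement rather than a fixed script.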
The real question to ask is whether learning bad habits in video games translates even a little bit to real life. This is an age-old debate, most frequently brought up when somebody claims that playing violent first-person shooters turns innocent children into mass-murdering psychopaths.
But we learn by working too, especially if the work is somewhat playful. Yes, video games teach us, but they teach us while producing absolutely nothing. Work also teaches us, and while it may be less fun, it actually produces value: it makes the world better, and you get paid rather than paying. And what you learn at work (or in any productive enterprise, whether an art project for Burning Man or a self-improvement project) is much more likely to be what you need to know for future work, whereas what you learn in a game may happen to be useful later, but also may not.
Playey Work >> Workey Play
I agree with everything you said. We should be especially cautious about playing so-called casual games, where the modal player very quickly reaches a plateau beyond which he or she will learn nothing new, let alone anything useful.
The difference, of course, is that the learning process in Real Life is slooooow. In game X, after 30 hours of play, the modal player may be one or two orders of magnitude better at a given skill (one that is at least somewhat unique to the game) than someone who has been playing for two hours. Some games (e.g., some first-person shooters) require few skills unique to them; I suspect their skill curves look similar, but most players arrive already hundreds of hours along the curve rather than just a few, so it appears flatter and the differences proportionally smaller.
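A standard way to formalize that intuition is the power law of practice; the functional form below is a common modeling assumption, not something the discussion above commits to:

```latex
% Power law of practice: skill S after t hours, with illustrative
% parameters S_0 > 0 and 0 < \alpha < 1.
S(t) = S_0\, t^{\alpha}
% The relative gain over any fixed multiple of hours is the same,
\frac{S(30)}{S(2)} \;=\; \frac{S(3000)}{S(200)} \;=\; 15^{\alpha},
% but the absolute slope keeps shrinking,
\frac{dS}{dt} = \alpha S_0\, t^{\alpha - 1},
% so a curve entered hundreds of hours in looks flat even though the
% underlying learning process has the same shape.
```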
Contrast that with life: in mine, the skills I am trying to cultivate are the same ones that I’ve been trying to cultivate for years, and my improvement is sometimes so glacial that I doubt whether I’m getting better at all. I could just be thousands of hours along similarly shaped curves, but I have certainly reached the point where I no longer see incremental improvement: all I see anymore are occasional insights that improve my abilities in discrete jumps.
So, Playey Work is much better for us and for the world at large than Workey Play. The difficulty lies in valuing that value: truly internalizing the need to produce and be valued, so that it overwhelms the extra fun that video games offer and will always be able to offer.
I earnestly want to know how to do that. If there were one optimization that could be applied to humans, I think that eliminating cognitive biases would do less to improve the human condition than increasing the weight each of us internally attaches to producing value for others. Speaking just for myself: if I could do that, through meditation or cognitive behavioral therapy or hypnosis or whatever, I would finish my dissertation right quick and just as quickly move on to doing the things that I know I want to do but right now want less than spending a dozen hours playing another video game.