Tentative theory: whether a person develops learned blankness has a lot to do with their early experiences in an area. Early experiences have something to do with innate talent—perhaps the ability to notice relevant distinctions.
This is something I’ve been pondering for a while and never been able to explain to my satisfaction.
I think society sets up the wrong expectations for interacting with computers. I see two categories of things—“people things”, and “nature things”.
People things would be stuff like paper forms, or communication skills, or shopping.
Nature things would be stuff like a garden (thanks to efm on irc!), or physics.
Computing has characteristics of both, but needs to be treated more like a nature thing, whereas it's often actually treated like a people thing.
I’m just kinda musing here, I don’t have any explanation of this I’m happy with.
I normally think in terms of social and technical skills, which is similar to this distinction but carves at different spots. In other words, there are problems where the ability to manipulate cognitive systems into a desired state is useful, and problems where the ability to manipulate non-cognitive systems into a desired state is useful.
A lot of people seem to define themselves as good at one area and bad at the other, as though the two were mutually inhibitory. There’s a connection here to gender roles, as well… social skills are more tightly associated with femininity and technical skills with masculinity, at least in the U.S.
People who define themselves as being good at social skills and bad at technical skills will be “not good with computers” in the same way they will be “not good with cars.”
There’s also an overlap with a class distinction here, at least in the U.S. Many blue-collar people who are “good with cars” will nevertheless not be “good with computers” because computers are associated with a different class. (This might be a matter of limited exposure, or might be a class-signaling thing, or both.)
My intuition is mostly the opposite, specifically that "bad with computers" people often treat applications like some gigantic, arbitrary natural system with lots of rules to memorize, rather than as artifacts created by people who are often trying to communicate function and purpose through every orifice in the interface.
It only makes sense to ask what the words in the menus actually mean if you assume they are the product of some person who is using them as a communication channel.
It’s perhaps more like maths. There’s an element of human communication, and an element of underlying truths.
I think there’s a problem in education.
I've learnt computers through a Computer Science-based education, so I don't have personal experience of this, but I'm told that computing education for non-specialists is very much focused on learning by rote ("these are the exact steps to do X"), with no attempt to understand the system in general.
Thus, when people have any problem outside the very specific examples they’ve learnt, they can’t cope.
The next question is, obviously, why is computing education structured like this?
My theories:
A lot of education works like this. We generally believe far too much in rote learning. Rote learning is probably better suited to situations that don't change much, but it gets deployed in computing, where the details you might rote-learn are likely to change drastically within a few years.
People don't like thinking about computing. They want to do the minimum necessary to accomplish their non-computing task. However, they underestimate the amount of computing knowledge required for this, and so actually end up making their task more difficult.
Experience certainly seems relevant.