I also don’t think this is a concern. It’s just analogy, metaphor, figurative language, which is more or less what the human mind runs on. Nor do I think it leads to real anthropomorphization in the minds of those using it; it’s more just a useful shorthand. Compare something I once overheard about atoms of a certain reactive element “wanting” to bond with other atoms. I don’t think either party was ascribing agency to the atoms; rather, “it wants X” is commonly understood as shorthand for “it behaves as if it wanted X”.
Thus it is common to hear hardware or software talked about as though it has homunculi talking to each other inside it, with intentions and desires. Thus, one hears “The protocol handler got confused”, or that programs “are trying” to do things, or one may say of a routine that “its goal in life is to X”. Or: “You can’t run those two cards on the same bus; they fight over interrupt 9.”
One even hears explanations like “… and its poor little brain couldn’t understand X, and it died.” Sometimes modelling things this way actually seems to make them easier to understand, perhaps because it’s instinctively natural to think of anything with a really complex behavioral repertoire as ‘like a person’ rather than ‘like a thing’.
At first glance, to anyone who understands how these programs actually work, this seems like an absurdity. As hackers are among the people who know best how these phenomena work, it seems odd that they would use language that seems to ascribe consciousness to them. The mind-set behind this tendency thus demands examination.
The key to understanding this kind of usage is that it isn’t done in a naive way; hackers don’t personalize their stuff in the sense of feeling empathy with it, nor do they mystically believe that the things they work on every day are ‘alive’. To the contrary: hackers who anthropomorphize are expressing not a vitalistic view of program behavior but a mechanistic view of human behavior.
Edit: see also: http://catb.org/jargon/html/anthropomorphization.html
--”Fabulous Prizes”, Dresden Codak
And, while we’re on the subject, here’s a classic:

“Let me see if I understand your thesis. You think we shouldn’t anthropomorphize people?”

-- Sidney Morgenbesser to B. F. Skinner
(via Eliezer, natch.)