Does “value the welfare of others” necessarily mean “consciously value the welfare of others”? Is it wrong to say “I know how to interpret human sounds into language and meaning” when my only evidence is that I can do it? Or do I have to demonstrate that I know how by deconstructing the process to the point that I could write an algorithm (or computer code) to do it?
The idea that we cannot value the welfare of computers seems ludicrously naive and misinterpretative. If I can value the welfare of a stranger, then clearly the thing whose welfare I value is not defined too tightly. If a computer (running the right program) displays some of the features that signal to me that a human is something I should value, why couldn’t I value the computer? We watch animated shows and value, and have empathy for, all sorts of animated entities. In all sorts of stories we have empathy for robots and other mechanical things. The idea that we cannot value the welfare of a computer flies in the face of the evidence that we can empathize with all sorts of non-human things, fictional and real. In real life, we value and feel human-like empathy for animals, fish, and in many cases even plants.
I think the interpretations or assumptions behind this paper are bad ones. Certainly, they are not brought out explicitly and argued for.
I actually read the paper. “It might also be argued that people playing with computers cannot help behaving as if they were playing with humans. However, this interpretation would: (i) be inconsistent with other studies showing that people discriminate behaviorally, neurologically, and physiologically between humans and computers when playing simpler games (19, 56–58), (ii) not explain why behavior significantly correlated with understanding (Fig. 2B and Tables S3 and S4)...”
(Points (iii) and (iv) apply to the general case of “people behave as if they are playing with humans”, but not to the specific case of “people behave as if they are playing with humans because of empathy with the computer”.)
“The idea that we cannot value the welfare of computers seems ludicrously naive and misinterpretative.”
I am always up for being ludicrous :-P
So, what is the welfare of a computer? Does it involve a well-regulated power supply? Good ventilation in its case? Is overclocking an example of inhumane treatment?
Or maybe you want to talk about software and the awful assault on its dignity by an invasive debugger...