I don’t think there is anything special about consciousness. “Consciousness” is what any intelligence feels from the inside, just as qualia are what sense perceptions feel like from the inside.
For qualia, that is precisely the definition of the word, and therefore does nothing to explain their existence. For consciousness, it likewise comes down to a definition, given a reasonable guess at what is meant by “intelligence” in this context.
What is this “inside”?
I am inclined to believe that what we call “consciousness” and even “sentience” may turn out to be ideas fully as human-specific as Eliezer’s favourite example, “humour”.
There’s at least a possibility that “suffering” is almost as specific.
Why? I’d expect that having a particular feeling when you’re damaging yourself and not liking that feeling would be extremely widespread. (Unless by “suffering” you mean something other than ‘nociception’, in which case can you elaborate?)
I mean something morally meaningful. I don’t think a chess computer suffers when it loses a game, no matter how sophisticated. I expect that self-driving cars are programmed to try to avoid accidents even when other drivers drive badly, but I don’t think they suffer if you crash into them.
Yeah, if by “suffering” you mean “nociception I care about”, it sure is human-specific.
I’d find this more informative if you explicitly addressed my examples.
Well, I wouldn’t usually call the thing that a chess computer or a self-driving car is minimizing “suffering” (though I could, if I felt like using more anthropomorphizing language than usual). But I’m confused by this, because I have no problem using that word for a sensation felt by a chimp, a dog, or even an insect, and I’m not sure what it is that an insect has and a chess computer lacks that produces this intuition of mine. Maybe it’s the fact that we share a common ancestor and our nociception capabilities are synapomorphic with each other… but then I think even non-evolutionists would agree a dog can suffer, so it must be something else.