Counterexample:
Oh God! I am in horrible pain right now! For no reason, my body feels like it’s on fire! Every single part of my body feels like it’s burning up! I’m being burned alive! Help! Please make it stop! Help me!!
Okay, so that thing that I just said was a lie. I was not actually in pain (I can confirm this introspectively); instead, I merely pretended to be in pain.
Sir Ian McKellen has an instructive video.
The Turing test works for many things, but I don’t think it works for detecting internal phenomenological states. If you asked me what GPT-3 was doing, I would expect it to be closer to “acting” than “experiencing.”
(Why? Because the experience of pain is a means to an end, and the end is behavioral aversion. At inference time, GPT-3 has no behavior for pain to avert. If anything, I’d expect GPT-3 to “experience pain” during training, but of course it isn’t aware while its weights are being updated. At the least, I think no system trained purely offline can experience pain at all.)
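To make the training/inference asymmetry concrete, here’s a minimal sketch (assuming PyTorch; the tiny linear model and random data are hypothetical stand-ins, not GPT-3 itself). Whatever “aversive” dynamic exists lives in the training loop, driven by the optimizer from outside; the inference path updates nothing:

```python
import torch
import torch.nn as nn

model = nn.Linear(8, 8)  # toy stand-in for a language model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x, target = torch.randn(4, 8), torch.randn(4, 8)

# Training: the loss is the signal being driven down. The weights are
# pushed away from high-loss configurations, but by the optimizer,
# outside any computation the model itself performs.
optimizer.zero_grad()
loss = nn.functional.mse_loss(model(x), target)
loss.backward()
optimizer.step()

# Inference: weights frozen, no gradients, nothing minimized or
# "avoided". This is the only regime in which GPT-3 produces text.
with torch.no_grad():
    output = model(x)
```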
I think we both agree that GPT-3 does not feel pain.
However, under a particular version of panpsychism (“pain is any internal state which a system attempts to avoid”), GPT-3 would obviously qualify.
Sure, but that definition is so generic, and applies to so many things that are obviously not like human pain (landslides?), that it carries no moral force.
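To see how little the definition constrains, here’s a toy illustration (plain Python, all names hypothetical): the same “move away from high-value internal states” update describes both a learner minimizing a loss and a rock sliding downhill, and nothing in the formalism distinguishes them:

```python
def descend(state, gradient, step=0.1):
    """Move the state away from whatever the gradient marks as 'worse'."""
    return state - step * gradient(state)

loss_grad = lambda w: 2 * (w - 3.0)   # "learner": squared-error loss
slope_grad = lambda x: 2 * x          # "landslide": height h(x) = x**2

w, x = 10.0, 10.0
for _ in range(50):
    w = descend(w, loss_grad)   # the model "avoids" high loss
    x = descend(x, slope_grad)  # the rock "avoids" high ground

print(w, x)  # both converge; by the definition above, both "felt pain"
```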