My summary of your argument: In order to guess the nature of AI experience, you look at the feelings or lack of feelings accompanying certain kinds of human cognition. The cognition involved with “love, attraction, friendship, delight, anger, hate, disgust, frustration” has feelings onboard; the cognition involved with sequence prediction does not; the AI only does sequence prediction; therefore it has no feelings. Is that an accurate summary?
No.
The argument is that feelings, or valence more broadly, in humans require additional machinery (amygdala, hypothalamus, etc.). If that machinery is missing, the pain/fear/.../valence is missing even though the sequence learning works just fine.
AI is missing this machinery; therefore it is extremely unlikely to experience pain/fear/.../valence.