I think you have defined me as not really talking, since I am on the autism spectrum and have trouble reading emotions from tone.
No, he didn’t. Talking is not listening, and there’s a big difference between being bad at understanding emotional nuance because of cognitive limitations and the information that would be necessary for understanding emotional nuance never even reaching your brain.
Was Stephen Hawking able to talk (late in life)? No, he wasn’t. He was able to write and his writing was read by a machine. Just like GPT4.
If I read a book to my daughter, does the author talk to her? No. He might be mute or dead. Writing and then having your text read by a different person or system is not talking.
But in the end, these are just words. It’s a fact that GPT4 has no control over how what it writes is read, nor can it hear how what it has written is being read.
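To make that point concrete, here is a minimal toy sketch in plain Python (none of these names are a real API; `model_reply` and `speak` are hypothetical stand-ins). The model’s entire contribution is a string; whoever runs the reader picks the voice and pacing, and nothing about that choice flows back to the model:

```python
# Hypothetical one-way pipeline: text generator -> speech reader.
# None of these names are a real API; they are stand-ins for illustration.

def model_reply(prompt: str) -> str:
    """The language model's entire contribution is a string of text."""
    return "I'm sorry to hear that."

def speak(text: str, voice: str, rate: float) -> None:
    """The reader alone decides how the text sounds; the model never sees this."""
    print(f"[{voice} voice @ {rate}x] {text}")

text = model_reply("My dog died yesterday.")
speak(text, voice="cheerful", rate=1.5)  # the model cannot object...
speak(text, voice="somber", rate=0.8)    # ...or even know the difference
```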
He wrote “unless your GPT conversator is able to produce significantly different outputs when listening the same words in a different tone, I think it would be fair to classify it as not really talking.” So if that is true, and I’m horrible at picking up tone so tone doesn’t impact my “outputs”, then I’m not really talking.
It’s probably better to taboo “talking” here.

In the broader sense of transmitting information via spoken words, of course GPT4 hooked to text-to-speech software can “talk”. It can talk in the same way Stephen Hawking (RIP) could talk: by passing written text to a mindless automaton reader.

I used “talking” in the sense of being able to join a conversation and exchange information not through literal text only. I am not very good at picking up tone myself, but I suppose that even people on the autism spectrum would notice a difference between someone yelling at them and someone speaking soberly, even if the spoken words are the same. And that’s definitely a skill that a GPT conversator should have if people want to use it as a personal therapist or the like (I am not saying that using GPT as a personal therapist would be a good idea anyway).
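As a toy illustration of why that criterion bites, here is a sketch in plain Python with made-up names (not any real speech stack): a text-only front end transcribes the tone away before the model ever sees the input, so the same words yelled or spoken soberly become byte-identical prompts and therefore yield identical outputs.

```python
from dataclasses import dataclass

# Hypothetical speech input: a real system would carry prosody as audio features.
@dataclass
class Utterance:
    words: str  # what was said
    tone: str   # how it was said: "yelling", "sober", ...

def transcribe(utterance: Utterance) -> str:
    """A text-only front end keeps the words and discards the tone."""
    return utterance.words

def model_reply(prompt: str) -> str:
    """Stand-in for a text-only model: output depends only on the prompt string."""
    return f"You said: {prompt!r}"

angry = Utterance("Where were you last night?", tone="yelling")
calm = Utterance("Where were you last night?", tone="sober")

# Same words, different tone -> identical prompt -> identical reply.
assert transcribe(angry) == transcribe(calm)
assert model_reply(transcribe(angry)) == model_reply(transcribe(calm))
print(model_reply(transcribe(angry)))
```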