It seems to me that for a lot of emotional processing, the presence of another human helps you articulate your thoughts, but most of the value lies in getting to better articulate things to yourself. Many characterizations of what it’s like to be a “good listener”, for example, are about being a person who says very little.
Disclaimer: I run an “AI companion” app, which has fulfilled the role of a romantic partner for a handful of people.
This is the main benefit I see of talking about your issues with an AI. Current-gen (RLHF-tuned) LLMs are fantastic at therapy-esque conversations, acting as a mirror that lets the human reflect on their own thoughts and beliefs. Their weak point (as a conversational partner) right now is lacking agency and consistent views of their own, but not everyone needs that.
See also: Rubber duck debugging.
Amusingly, I kind of wrote this essay with the help of ChatGPT. Writing a Real Essay felt like too much work, but then I thought, maybe if I described to ChatGPT what kind of essay I had in mind, it could improve it and we could iterate on it together.
It turned out that describing to ChatGPT the kind of essay I’d like to write did 96% of the work: when I looked at that description of “this is what I’d like to say”, I concluded that I could just copy-paste most of it directly into the final piece, and didn’t really need ChatGPT to do anything anymore.
So that was pretty rubber duck-y.
Maybe we can refer to these systems as cybernetic or cyborg rubber ducking? :)
Silicon ducking? Cyberducking?