Human relationships should be challenging. Refusing to be challenged by those around you is what creates the echo chambers we see online, where your own opinions get fed back to you, only reassuring you of what you already believe. These were created by AI recommendation algorithms whose only goal was to maximise engagement.
Why would an AI boyfriend or girlfriend be any different? They would not help you develop as a person; they would exist only to serve your desires, not to push you to improve who you are, not to teach you new perspectives, not to give you opportunities to bring others joy.
I understand all this logically, but my emotional brain asks, “Yeah, but why should I care about any of that? I want what I want. I don’t want to grow, or improve myself, or learn new perspectives, or bring others joy. I want to feel good all the time with minimal effort.”
When wireheading—real wireheading, not the creepy electrode-in-the-brain sort that few people would actually accept—is presented to you, it is very hard to reject, particularly if you have a background of trauma or neurodivergence that makes coping with “real life” difficult to begin with. That is why so many people with brains like mine end up as addicts. Actually, by some standards, I am an addict, just not to any physical substance.
And to be honest, as a risk-averse person, it’s hard for me to rationally argue for why I ought to interact with other people when AIs are better, beyond the people I already know, trust, and care about. Like, where exactly is my duty to “grow” (from other people’s perspective, by other people’s definitions, because they tell me I ought to do it) supposed to be coming from? The only thing that motivates me, sometimes, to try to do growth-and-self-improvement things is guilt. And I’m actually a pretty hard person to guilt into doing things.