To be honest, I look forward to AI partners. I have a hard time seeing the point of striving to have a “real” relationship with another person, given that no two people are really perfectly compatible, no one can give enough of their time and attention to really satisfy a neverending desire for connection, etc. I expect AIs to soon enough be better romantic companions—better companions in all ways—than humans are. Why shouldn’t I prefer them?
From a hedonistic and individualistic perspective, sure, AI partners will be better for individuals. That’s the point that I’m making. People will find human relationships frustrating and boring in comparison.
But people also usually don’t care exclusively about themselves; part of them also cares about society and their family lineage, and the knowledge that they didn’t contribute to these super-systems will, in itself, poison many people’s experience of their hedonistic lives. Also, if people never experience AI relationships in the first place (experiences they might not be able to “forget”), and instead settle for human relationships that are inferior in some ways but produce a more wholesome experience overall, their total life satisfaction may also be higher. I’m not claiming this will be true for all, or necessarily even most, people; for example, child-free people are probably less likely to find their AI relationships incomplete. But it may be true for a noticeable proportion of people, even from Western individualistic cultures, and perhaps even more so from Eastern cultures.
Also, obviously, the post is not written from the individualistic perspective. The title says “AI partners will harm society”, not that they will harm individuals. From the societal perspective, there could be a tragedy-of-the-commons dynamic where everybody takes the maximally individualistic perspective and then the whole society collapses (whether in terms of population, epistemics, or culture).
Human relationships should be challenging. Refusing to be challenged by those around you is what creates the echo chambers we see online, where your own opinions get fed back to you, only reassuring you of what you already believe. Those echo chambers were created by AI recommendation algorithms whose only goal was to maximise engagement.
Why would an AI boyfriend or girlfriend be any different? They would not help you develop as a person, they would only exist to serve your desires, not to push you to improve who you are, not to teach you new perspectives, not to give you opportunities to bring others joy.
I understand all this logically, but my emotional brain asks, “Yeah, but why should I care about any of that? I want what I want. I don’t want to grow, or improve myself, or learn new perspectives, or bring others joy. I want to feel good all the time with minimal effort.”
When wireheading—real wireheading, not the creepy electrode in the brain sort that few people would actually accept—is presented to you, it is very hard to reject it, particularly if you have a background of trauma or neurodivergence that makes coping with “real life” difficult to begin with, which is why so many people with brains like mine end up as addicts. Actually, by some standards, I am an addict, just not of any physical substance.
And to be honest, as a risk-averse person, it’s hard for me to rationally argue for why I ought to interact with other people when AIs are better, except the people I already know, trust, and care about. Like, where exactly is my duty to “grow” (from other people’s perspective, by other people’s definitions, because they tell me I ought to do it) supposed to be coming from? The only thing that motivates me, sometimes, to try to do growth-and-self-improvement things is guilt. And I’m actually a pretty hard person to guilt into doing things.
Because your AI partner does not exist in the physical world?
I mean, of course an advanced chatbot could be both a better conversationalist and a better lover than most humans, but it is still an AI chatbot. Just to give an example, I would feel like a complete idiot if I ever found myself asking a chatbot about its favorite dish.
That’s a temporary problem. Robot bodies will eventually be good enough. And I’ve been a virgin for nearly 26 years; I can wait a decade or two longer until there’s something worth downloading an AI companion into, if need be.