I think that your model severely underestimates the role of social stigma. Spending a lot of time on your screen chatting with an AI whose avatar is suspiciously supersexy would definitely be categorized as “porn” by a lot of people (including me). Will it be more addictive than simply looking at photos/videos of hot people naked? Probably yes, but it will still occupy the same mental space as “porn” — if not for the users themselves, then at least for casual observers. Imagine trying to explain to your parents that the love of your life is an AI with a supersexy avatar.
My model of the near future is that these chatbots will replace every other form of online porn, because that part is very easy even without conversational skills (and Stable Diffusion is already capable of generating photorealistic pictures of super-hot people). I am quite skeptical about wide social acceptance of romantic love with AI chatbots, and without social acceptance I don’t think it can go beyond being the next kind of porn.
I agree that it probably won’t be socially acceptable to admit that you are in love with your AI partner, for the time being. That is why the young man in my short “mainline scenario” downplays to his friends the level of intimacy he has with his AI partner. His parents probably won’t know at all; their son just “studies at college and doesn’t have time for girls”. Importantly, the young man may deceive even himself, not consciously perceiving his attitude towards the AI as “love”, but he may nevertheless become totally uninterested in seeking romance with humans, or even in watching porn (other than videos generated with the avatar of his AI partner).
I’m not sure about what I’ve written above, of course, but I definitely think that the burden of proof is on AI startups, cf. this comment.
My point was that it is difficult for a behavior to destroy the fabric of society if you have to hide from friends and family when indulging in it. Of course, some people will fall totally in love with AI chatbots and isolate themselves, but the same is true for recreational drugs, traditional porn, etc. I still don’t see an immediate danger for the majority of young people.
The main problem of your hypothetical man is that he doesn’t manage to have sex. I agree that this can be a real problem for a lot of young men. On the other hand, not having sufficiently interesting conversations does not feel like something that the average teenager is likely to suffer from. If you give a super-hot AI girlfriend to a horny teenager, I think that the most likely outcome is that he will jump straight to the part where the avatar gets naked, again and again and again, and the conversational skills of the bots won’t matter that much. You have to fool yourself really hard to see “super-hot AI bot who does everything I ask” as a “normal love relationship” rather than “porn up to eleven”.
My point was that it is difficult for a behavior to destroy the fabric of society if you have to hide from friends and family when indulging in it. Of course, some people will fall totally in love with AI chatbots and isolate themselves, but the same is true for recreational drugs, traditional porn, etc. I still don’t see an immediate danger for the majority of young people.
See the last section of the post. Yes, porn is also harmful to society (but pleasurable to the individuals who watch it). Is this a controversial fact? Recreational drugs (unlike hard drugs) are actually probably more useful to society than harmful. And I didn’t say that the fabric of society will be “destroyed” by AI partners, but I do think they will harm this fabric. Given that I suspect AI partners will be a more potent source of digital addiction (or “lower love”, which is also a form of addiction) than anything except hard drugs, including social media, my heuristic inference is that AI partners will harm society on the scale of social media (or more, but maybe not, because the networked nature of social media makes it in some ways nastier), which has harmed society a lot.
On the other hand, not having sufficiently interesting conversations does not feel like something that the average teenager is likely to suffer from. If you give a super-hot AI girlfriend to a horny teenager, I think that the most likely outcome is that he will jump straight to the part where the avatar gets naked, again and again and again, and the conversational skills of the bots won’t matter that much.
Maybe the young man also misses intimacy and the feeling that somebody understands him and appreciates him even more than he misses sex. Then AI partners will be much more potent. I’m not an expert social psychologist, so I can’t make confident claims here, but neither are you, probably; we are just hypothesising past each other.
And perhaps even “expert” social psychologists and anthropologists couldn’t be sure, because these domains lack robust predictive models, and we are discussing a completely novel phenomenon entering the sphere, namely AI partners. So I think it should be AI romance startups that run limited experiments over years before we decide that the technology is safe for mass adoption. I find it weird that this is by now the baseline framework in pharmacology and self-driving car tech, for instance, but is met with such resistance when we discuss something that will mess with human psychology.
You have to fool yourself really hard to see “super-hot AI bot who does everything I ask” as a “normal love relationship” rather than “porn up to eleven”.
Are you sure that IQ 90 people have to fool themselves as hard as you would need to?
Yes, porn is also harmful to society (but pleasurable to the individuals who watch it). Is this a controversial fact?
For what it’s worth, my association of the “porn is harmful for society” stance is mostly that of a right-wing/religious conservative/anti-sex ideological position. Outside those kinds of circles, I’ve seen some concerns about it giving young people misleading impressions of what to expect from sex, but I don’t recall seeing much of a sentiment that it would be an overall negative—neither from the laypeople nor the social scientists/sexologists who I’ve been exposed to.
I think it’s not so much about a “wrong image” as about the relationship participation rate. I don’t think porn can be neutral with respect to the growing number of young people who are not in relationships (63% of men under 30 in the US) -- if you watch porn (and especially if you are addicted to it), you have less motivation to go out, look for dates, and form relationships. I don’t claim that this effect is huge, or that it outweighs the positive effects on individuals, but I cannot see how there could be no effect at all. And of course, porn is not the only factor that makes zoomers more and more isolated and less involved in relationships.
It’s plausible that there could be such an effect, yes. On the other hand, there are also indications that a similar effect plays a role in reducing the amount of sexual violence (countries where porn was criminalized saw significant reductions in their rape statistics after legalizing it), in helping de-stigmatize various uncommon sexual fetishes and preferences and thus protecting the mental health of their users, in exposing people to more information about what kinds of sexual activities they might like (and thus possibly making them happier, with positive effects on society), etc.
I don’t think the idea of porn as a “replacement good” for sex really holds, especially outside of the (very few) people who get literally addicted to it. I would expect other factors to have a much stronger chilling effect, like a lack of free time or an inability to navigate overly complex social norms that feel particularly unforgiving.
Maybe the young man also misses intimacy and the feeling that somebody understands him and appreciates him even more than he misses sex
Well, maybe. But this seems a stronger assumption; we are basically considering someone with an unsupportive family and no close friends at all (someone could object, “I suffer because my supportive friends are not pretty girls”, but I would still count that as a proxy for “I miss sex”). Also, “no one tells me that I’m good, so I’ll set up a bot” is something that would mark this person as a total loser, and I’m skeptical that lots of people will do it given the obvious associated social stigma. I would rather expect this kind of AI usage to follow dynamics similar to those of alcoholism (the traditional way to forget that your life sucks). I would also tentatively say that isolating yourself with an AI companion is probably less harmful than isolating yourself with a whiskey bottle.
Anyway, I’m not arguing in favor of totally unregulated AI companion apps flooding the market. I agree that optimizing LLMs to be as addictive as possible when imitating a lover sounds like a bad idea. But my model is that the kind of people who would fall in love with chatbots are the same kind of people who would fall in love with plain GPT prompted to act like a lover. I’m not sure how much additional damage we will get from dedicated apps… especially considering that plain GPT is free while AI companion apps typically require a subscription (and even our IQ 90 people should be able to recognize that something which gets abruptly interrupted if you don’t pay $20/month is “not a normal relationship”).