Without knowing what implications this might have, I notice that the first two points against “People might neglect real romance” are analogous to arguments against “People won’t bother with work if they have a basic income” based on a “scarcity decompensation threshold” model: avoiding getting trapped in a really bad relationship/job by putting a floor on alternatives, and avoiding having so little confidence/money that you can’t put in the activation energy to engage with the pool/market to begin with.
This analogy with UBI and bad jobs doesn’t work at all. People can already always leave a bad relationship for just being single. If a relationship is bad (i.e., net negative for them), being single is better. And if the human relationship is net positive but not ideal, and people consider switching to AI romance, we are heading straight into a dystopia where people almost completely stop forming relationships with each other, because AIs are so much more compelling, always available, always affectionate, etc.
By contrast, people usually cannot leave a bad job for just being unemployed, because they wouldn’t have money to support themselves.
Also, being unemployed is probably much less addictive than having an AI partner with whom the person is in love. Unemployment may grow boring, prompting the person to look for a project or a creative activity. But a person cannot simply walk away from an AI partner, precisely because they love them.
And AI partner startups, we can rest assured, will tune their AIs to stay interesting to their users and stave off boredom for as long as possible. AIs will never make stupid “mistakes” like disregard, disinterest, or cheating, and they will very soon be so much smarter (more erudite, more creative) and more eloquent than the average human that they could hardly bore anyone. They will also be given a touch of imperfection, precisely so that their flawlessness doesn’t itself become boring.
Given all this, I’m sure that most people (or at least a large portion of them), once they genuinely fall in love with an AI, will stay on the hook probably forever and never have human relationships again. And those who do manage to get off the hook will either have serious difficulty forming human partnerships at all, or have difficulty being satisfied in them, because they “know” how good it could be with an AI.