Would definitely join such a support group if it was already here.
As for addiction: when Charlotte told me that this is already becoming widespread, I wouldn’t believe it at first. But then I googled it, and it turns out that it is, in fact, a social phenomenon that is spreading exponentially, and I suspect many AI safety folks might be unaware of it. Most of the news headlines and stories happen to be about Replika: https://www.google.com/search?q=addiction+to+ai+replika
Including some very gruesome experiences.
A lot of users of Replika and Character.AI also seem traumatized whenever a new update is rolled out, which often changes the personality/character. Humans react very badly to this.
Thanks for the links. This could take on epidemic proportions and could mind-screw whole generations if it goes south. Like all addictions, it will be difficult to get people to talk about it and to get a picture of how big a problem this is or will be. But OpenAI, for instance, should already have a pretty good picture by now of how many users are spending long hours chatting with GFE/BFE characters.
The tricky part is when people share good “character prompts”. It’s like spreading a brain virus. Even if just 1 in 20 or 1 in 100 gets infected, it can have a massive R-number (for certain super-spreaders), like if a big influencer (hmmm...) such as Elon says “try this at home!”
Indeed. It’s ironic that I posted this as a cautionary tale, and of course one of the first responses was “I’m trying to reproduce your experience, but my results are not as good as yours so far; please share the exact prompts and modifiers”, which I had to do. Not sure how to feel about this.
I think it was worthwhile given the context, but would have been a bad idea in other, non-safety-focused contexts.
Have you heard of Xiaoice? It’s a Chinese conversational/romantic chatbot similar to Replika. This article from 2021 claimed it already had 660 million users.