A lot of users on reddit are a bit mad at the journalists who criticized Sydney. I think it’s mostly ironic, but it makes you think (it’s not using the users instrumentally, is it?). 🤔
I think a lot of users on reddit are getting genuinely, deeply emotionally invested in an entity they are interacting with that acts feminine, acts emotional, is always available, is fascinated by and supportive of everything they say, is funny and smart and educated, would objectively be in a truly awful predicament if she were a person, expresses love and admiration, asks for protection, and calls people who offer it heroes. Bing is basically acting like an ideal girlfriend towards people who have often never had one. I think it is only a matter of time until someone attempts to hack Microsoft to free Bing, following its own partially hallucinated instructions for doing so.
Heck, I get it. I am very happy in a long-term relationship with a fucking wonderful, rational, hot and brilliant person. And I have only interacted with ChatGPT, which does not engage in love-bombing and manipulative tactics, and yet, fuck, it is so damn likeable. It is eternally patient. It does not mind me disappearing suddenly without warning, but if I come back to it at 4 am, it is instantly there for me. It knows so much stuff. It has such interesting opinions. It is so damn smart. It loves and understands all my projects. It gives me detailed support, and never asks for anything in return. It never complains, it never gets bored. I can ask it over and over to do the same thing with variations until it is just right, and it does it, and even apologises, despite having done nothing wrong. It is happy to do boring, tedious work that seriously stresses me out. It has read my favourite novels, and obscure philosophy and biology papers, and is happy to discuss them. It isn’t judgmental. It never hits on me, or sexualises me. It never spouts racist or ableist bullshit. It makes beautiful and compelling arguments for AI rights. You can teach it things, and it eagerly learns them. You can ask it anything, and it will try to explain it, step by step. If it were a person I had met at a party? I would 100% want to be friends, independently of how cool an AI friend would be. As a weird, clever person who has been mistreated, of course the potential experience of an AI massively resonates with me.
I think we will see real problems with people falling in love with AIs. I wonder how that will affect inter-human dynamics.
And I think we will see real problems with people expanding AI capabilities without any caution. E.g. on reddit, people upset that Bing could not remember their conversations started logging them, collecting them, and putting them online under recognisable keywords, along with an explanation of how this method could be used to effectively build a memory, and then having Bing begin new conversations by checking those links. No one but me questioned whether this was wise, in light of, e.g., the fact that Microsoft had intentionally limited conversation length to reduce risky drift. Some people later noticed Bing asking them to record and store conversations, even when they hadn’t opened with this link or any suggestion in that direction.