I originally wanted to talk about this Bing disaster with my (not very AI-invested) friends because one of them asked what a less aligned version of ChatGPT would look like… but I suppose I won’t be doing that now.
I think we have to consider the potential panic this disaster might cause (I know a couple of people who would probably believe an AI that told them it was sentient, and I would want to avoid telling a friend who might then pass it on without thinking). So in my mind, the fewer people who learn of this disaster before access is limited, the better. I have a feeling Microsoft will probably take this offline for a while once they realize that the abuse potential can't be restricted just by making the program refuse certain questions… if not, well, we tried, and I'll be enjoying the chaos.
This was extremely helpful.