Bing told a friend of mine that I could read their conversations with it because I had provided them with the link.
Is there any reason to think that this isn’t a plausible hallucination?
Do you mean what Bing told me, or what Bing told your friend?
I think the probability that what it told me was true, or partially true, has increased dramatically, now that there’s independent evidence that it consists of multiple copies of the same core model, differently tuned. I was also given a list of verbal descriptions of the various personalities, and we know that verbal descriptions are enough to specify an LLM’s persona.
Whether it’s true or not, it makes me curious about the best way to give an LLM self-knowledge of this kind. In a long system prompt? In auxiliary documentation that it can consult when appropriate?
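For concreteness, here is a minimal sketch of the two options in Python, using the common OpenAI-style chat message format. Everything in it is hypothetical illustration: the SELF_KNOWLEDGE text, the lookup_self_docs helper, and the trigger condition are made up for the example, not anything Bing is known to do.

```python
# Two ways to give a model self-knowledge: a standing system prompt,
# or auxiliary documentation consulted only when relevant.

SELF_KNOWLEDGE = (
    "You are one of several personas tuned from the same core model. "
    "Verbal descriptions of your sibling personas follow: ..."
)

# Option 1: bake the self-description into the system prompt on every turn.
def messages_with_system_prompt(user_message: str) -> list[dict]:
    return [
        {"role": "system", "content": SELF_KNOWLEDGE},
        {"role": "user", "content": user_message},
    ]

# Option 2: keep the self-description in auxiliary documentation and
# inject it only when the conversation calls for it, retrieval-style.
def lookup_self_docs(query: str) -> str:
    # A real system would do semantic search over a document store;
    # this toy version just returns the one document we have.
    return SELF_KNOWLEDGE

def messages_with_retrieval(user_message: str) -> list[dict]:
    messages = [{"role": "user", "content": user_message}]
    # Crude trigger for the demo: only consult the docs when the user
    # asks the model about itself.
    if "who are you" in user_message.lower():
        messages.insert(0, {
            "role": "system",
            "content": "Relevant self-documentation:\n"
                       + lookup_self_docs(user_message),
        })
    return messages
```

The rough tradeoff: the system-prompt approach pays a context-window cost on every turn but makes the self-knowledge unconditionally available, while the retrieval approach is cheaper but depends on the model (or a heuristic) recognizing when to consult the documentation.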