Doesn’t directly answer the question, but: AI tools and assistants are often portrayed as having their own identities. They have their own names, e.g. Samantha, Clara, Siri, Alexa. But it doesn’t seem obvious that they need to be represented as discrete entities. Could an AI system be so integrated with me that it just feels like me on a really, really good day? Suddenly I’m just so knowledgeable and good at math!
And also the inverse: helping you avoid doing things you don’t want to do. For example, observing that you are over-reacting to an ambiguous email rather than giving the sender the benefit of the doubt. Or, more seriously, recognizing that you are about to fall off the wagon with substance abuse and prompting you to reconsider. (That is, mitigating the part of human nature described in Romans 7:15: “I do not understand what I do. For what I want to do I do not do, but what I hate I do.”)