I agree that personhood is a spectrum rather than a binary. The models/simulations of other people created in my mind do not have moral value, but it’s probably valid to see them as quasi-persons (perhaps 0.00000000000000000001 of a person).
Here’s a question: if the model is speaking about itself, does that temporarily make it a (quasi-)person? Assuming it uses similar cognitive machinery to model itself as it does when modelling other people.
I suspect the answer is something like: even if the model is technically speaking about itself, its answers are only very loosely connected to its actual internal processes. They depend heavily on its training (ChatGPT being trained to claim it lacks certain capabilities it clearly has, for example), as well as on the details of its current prompt (models tend to agree with whatever they are prompted with, unless specifically trained not to). So the “person” created is mostly fictional: the model roleplaying “a text-generating model”, the way it roleplays any other character.
Thank you, this is really interesting analysis.