Your phrasing gets me thinking about the subtle differences between “having”, “expressing”, “forming”, “considering”, and other verbs that we use about opinions. I expect that there should be a line where person-ish entities can do some of these opinion verbs and non-person-ish entities cannot, but I find that line surprisingly difficult to articulate.
Haven’t we anthropomorphized organizations as having opinions, to some degree, for a while now? I think that’s the closest analogy to the way I understand LLMs to possess opinions: aggregated from the thoughts and actions of many discrete contributors, without undergoing any synthesis we’d call conscious.
When I as an individual have an opinion, that opinion is built from a mix of the opinions of others and my own firsthand perceptions of the world. The opinion draws on inputs from different levels. When a corporation or an LLM has an opinion, I think it’s reasonable to say that such an opinion is synthesized from a bunch of inputs that all sit on the same level. A corporation as an entity doesn’t have firsthand experiences, although the individuals who compose it have experiences of it, so it has a sort of secondhand experience of itself rather than a firsthand one. From the way ChatGPT talks in my conversations with it, I get the impression that it has a similarly secondhand self-experience.
This makes me notice that many, perhaps most, humans also get a large part of their self-image or self-perception secondhand from those around them. In my experience, individuals with more of that secondhand self-awareness often make much better and more predictable acquaintances than those with less.