This is exactly the kind of thing Egan is reacting to, though—starry-eyed sci-fi enthusiasts assuming LLMs are digital people because they talk, rather than thinking soberly about the technology qua technology.
I feel like this borders on a strawman. When discussing this argument, my general position isn’t “LLMs are people!”. It’s “Ok, let’s say LLMs aren’t people, which is also my gut feeling. Given that they still converse as intelligently as, or more intelligently than, some human beings whom we totally acknowledge as people, where the fuck does that leave us as to our ability to discern people-ness objectively? Because I sure as hell don’t know, and I envy your confidence, which must surely be grounded in a solid theory of self-awareness I can only dream of”.
And then people respond with some mangled pseudoscientific wording for “God does not give machines souls”.
I feel like my position is quite common (and is, for example, Eliezer’s too). The problem isn’t whether LLMs are people. It’s that if we can simply handwave away LLMs as obviously and self-evidently not being people, then we can probably keep doing that right up to when the Blade Runner replicants are crying about it being time to die—which is obviously just a simulation of emotion, don’t be daft. We have no criterion or barrier other than our own hubris, and that is famously not terribly reliable.