I don’t have any supporting citation for your premise.
But the fundamental abstraction holds: someone writing a character is, in essence, running a simulation of that character.
That seems completely reasonable to me. The main difference between that and an LLM doing it would be that humans lack the computational resources to run the simulation with enough fidelity to call that character a person.
hmm, I think a point was lost in translation then. If I were a person named Sally and I wrote a character named Dave, then I as a whole am the one pretending to be Dave; Sally is also just a character, after all. The true reality of what I am is a hunk of cells working together, using the genetic and memetic code that produces a structure which can encode language and labels its originator Sally or Dave. Similarly with an AI: it’s not that the AI is simulating a character so much as that the AI is a hunk of silicon with the memetic code necessary to output the results of personhood.
I’m not convinced. Imagine that someone’s neurons stopped functioning, and you were shrunken down inside their brain, running around moving neurotransmitters to make it function. When they act intelligently, is it really your intelligence?
If you’re Sally and you write a character Dave in the detail described here, you are acting as a processor executing a series of dumb steps that make up a Dave program. Whether the Dave program is intelligent is separate from whether you are intelligent.
not really? Dave is virtualized, not emulated. When writing about Dave, Sally uses almost all the same faculties she would use writing about herself.
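To make the virtualized/emulated contrast concrete, here is a rough sketch in Python. This is my own analogy, not something from the thread: an emulator interprets every step of a guest program with its own separate machinery (the "dumb steps" view from the previous message), while virtualization runs the guest directly on the host's native faculties, only relabeling the output. All names (`emulate`, `Host`, `speak_as`) are hypothetical illustrations.

```python
def emulate(program, env):
    """Emulation: interpret each guest instruction ourselves,
    step by dumb step, using machinery separate from the host's own."""
    for op, arg in program:
        if op == "say":
            env["output"].append(arg)
        elif op == "recall":
            env["output"].append(env["memory"][arg])
    return env["output"]


class Host:
    """Virtualization: the guest persona reuses the host's own faculties
    (here, one shared memory and one shared 'speak' mechanism)."""

    def __init__(self, memory):
        self.memory = memory  # shared faculty: the writer's own memory

    def speak_as(self, persona, thought):
        # Same native machinery either way; only the label differs.
        return f"{persona}: {thought} (recalled: {self.memory['home']})"


host = Host({"home": "a small town"})
# Sally writing as herself vs. writing Dave uses the same faculties:
print(host.speak_as("Sally", "I remember"))
print(host.speak_as("Dave", "I remember"))
```

On this analogy, the "processor executing a Dave program" picture matches `emulate`, while the claim in this message is that writing Dave is more like `speak_as`: the same underlying faculties, relabeled.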