“My cognitive process isn’t well understood by the person I’m interacting with, so they literally couldn’t imagine me accurately.”
This isn’t a one-size-fits-all argument against ghosts, but it does point to a real thing. A rock isn’t a ghost. A rock is not capable of imagining me accurately, and it isn’t running any algorithm remotely similar to my own, so I don’t shape my decisions around the possibility that I am actually a rock. The same goes for calculators and ELIZA: no ghosts there. I suspect there are no ghosts in GPT-3, but I am not sure. At least some humans are dumb and insane enough to contain no ghosts, or at least no ghosts that might be you.

The problem is wispy ghosts. The solidest ghost is a detailed mind simulation of you. Wispy ghosts are found in things that are, a little bit, thinking the same thing. Consider a couple of chimps fighting over a banana, and a couple of national governments at war. Do the chimps contain a wispy ghost of the warring nations, because a little of the chimps’ reasoning happens to generalize far beyond bananas?
Where do the faintest ghosts fade to nothing? This is the same as asking what processes are logically entangled with us.
On the other hand, I wouldn’t expect this type of argument to work between a foomed AI with Graham’s number of compute and one with 1 kg of computronium.
This causes me to be less trusting of people who seem to think I’m not smart enough to understand how they think.
I think the fact that you can think that, in general terms, pushes you somewhat towards the real-ghost side. You know the general pattern, even if not the specific thoughts that those smarter than you might have.
GPT-3 is more like a super-fuzzy distribution of low-resolution ghosts.