Imagine there’s a Game of Life world with a society of evolved intelligent creatures inside it, and you’re observing the world from the outside. The creatures communicate with each other, “refer” to things in their environment, introspect on their internal state, consider counterfactuals, make decisions, etc. You from the outside can understand all these cognitive processes fully mechanistically, as tools that don’t require any ontologically fundamental special sauce to work. When they refer to things, it’s “just” some mechanism for constructing models of their environment and sharing information about it between each other or something, and you can understand that mechanism without ever being compelled to think that “fundamental deixis” or something is involved. You will even observe some of the more introspective agents formulate theories about how explaining their experience seems to require positing fundamental deixis, or fundamental free will, or fundamental qualia, but you’ll be able to tell from the outside that they’re confused, and everything about them is fully explicable without any of that stuff.
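(To make "fully mechanistic" concrete: the entire dynamics of such a world reduce to one deterministic update rule. A minimal sketch of that rule in Python, representing the world as a set of live cell coordinates:)

```python
from collections import Counter

def step(live):
    """One Game of Life update: count each cell's live neighbors,
    then apply the birth/survival rule. Fully deterministic --
    everything the creatures do supervenes on iterating this."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next tick iff it has exactly 3 live neighbors,
    # or it is currently live and has exactly 2.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}
```

Every fact about the creatures, including their theorizing about their own experience, is some (enormously coarse-grained) pattern in the trajectory produced by iterating `step`.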
Right?
Then the obvious question is: should I think I’m different from the Game of Life creatures? Should I think that I need ontologically fundamental special sauce to do things that they seem to be able to do without it?
This isn’t really a tight argument, but I’d like to see how your views deal with thought experiments of this general kind.