Z. M. Davis: But if you think about the things that the homunculus tends to do, I think you would find yourself needing to move to levels below the homunculus to do it. To give it a coherent set of actions it is likely to take, and not to take, at any given time, you would have to populate it with wants, with likes, with beliefs, with structures for reasoning about beliefs.
I think eventually you would come to an algorithm of which the homunculus would have to be an instantiation, and you would have to assume that that algorithm was represented somewhere.
I just don’t see how you can make sensible predictions about ontologically basic complicated things. And I know people will go on about how you can’t make predictions about a person with free will, but that’s a crock. You expect me to try to coherently answer your post. I expect a cop to arrest me if I drive too fast. More to the point, we don’t expect neurologically intact humans to spend three years walking backwards, or talk to puddles, or remove their clothing and sing “I’m a little teapot” in Times Square.
And the same goes for gods, incidentally. Religious folk will say that their gods’ ways are ineffable, that they can’t be predicted. But they still expect their gods to answer prayers, and forgive sins, and torture people like me for millennia, and they don’t expect them to transform Mount Everest into a roast beef sandwich, or thunder forth nursery rhymes from the heavens.
They have coherent expectations, and for those expectations to make sense you have to open the black box and put things in there. You have to postulate structure, and relationships between parts, and soon you haven’t got something ontologically basic anymore.