Bats, with sensory systems so completely different from those of humans, must have exotic bat qualia that we could never imagine. (...) …we still have no idea what it’s like to feel a subjective echolocation quale.
(Excuse me for being off topic)
Reductionism is true; if we really knew everything about a bat's brain, bat qualia would be included in the package. Imagine a posthuman able to model a bat's brain and sensory modalities at the neural level, in its own mind. There is no way it'd find anything missing about the bat; there is no way it'd complain about persistently mysterious bat qualia. It's a fact that current humans are very bad at modeling any minds, including their own. Thus, human-level neuroscientists researching the bat's brain are a bit like the human operator in Searle's Chinese Room: they have access to a lot of abstract information, but they're unable to actually hold a neural model of the bat in their minds and simulate its firings.
In short, I think that in this case it's more reasonable to point to insufficient brainpower before we start considering fundamental epistemological problems.
It’s a fact that current humans are very bad at modeling any minds, including their own.
“Very bad” compared to what? We are brilliant at modelling minds relative to our ability for abstract reasoning, mathematics and, say, repeating a list of 8 items we were just told in reverse order.
Trying to imagine the neurons and simulate their firings by doing mental arithmetic still seems far-fetched; that is the kind of modeling I meant.
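To give a sense of scale for that arithmetic, here is a minimal sketch of what "simulating firings" involves even in a toy case, using a leaky integrate-and-fire model. The network size, weights, and constants are all made up for illustration; nothing here is taken from bat data or any real neuroscience model.

```python
# Toy leaky integrate-and-fire network, to make concrete the per-neuron
# arithmetic a simulator (mental or otherwise) would have to carry out.
# All parameters below are illustrative, not biologically calibrated.

import random

random.seed(0)

N = 5            # illustrative neuron count; a real brain has many orders of magnitude more
DT = 1.0         # timestep, ms
TAU = 20.0       # membrane time constant, ms
V_REST = 0.0     # resting potential (arbitrary units)
V_THRESH = 1.0   # firing threshold
STEPS = 50

# Random synaptic weights: weights[i][j] is the input neuron i receives
# when neuron j fires.
weights = [[random.uniform(-0.3, 0.6) for _ in range(N)] for _ in range(N)]

v = [V_REST] * N      # membrane potentials
fired = [False] * N   # which neurons spiked on the previous step

for t in range(STEPS):
    new_fired = [False] * N
    for i in range(N):
        # Synaptic input: sum the weights of neurons that just fired.
        syn = sum(weights[i][j] for j in range(N) if fired[j])
        # Leaky integration: decay toward rest, then add synaptic input
        # plus a small constant drive. This line is the "mental
        # arithmetic" repeated for every neuron, every millisecond.
        v[i] += DT / TAU * (V_REST - v[i]) + syn + 0.08
        if v[i] >= V_THRESH:
            new_fired[i] = True
            v[i] = V_REST  # reset after a spike
    fired = new_fired
    spikes = [i for i in range(N) if fired[i]]
    if spikes:
        print(f"t={t} ms: neurons {spikes} fired")
```

Even this five-neuron toy demands dozens of multiply-adds per millisecond of simulated time; scaling that up to a whole brain at neural resolution is exactly the brainpower gap the comment points at, and it's why a posthuman that could run such a model in its own mind would be in a very different epistemic position than we are.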