I’m not sure it follows that a bat has one one-millionth the subjective experience of a human. The problem is that you can’t necessarily add a bunch of bat-experiences together to get something equivalent to a human experience; in fact, it seems to me that this sort of additivity only holds when the experiences are coherently connected to each other. (If someone hooked up a million bat-brains into a giant network, then it might make sense to ask “Why am I a human, rather than a million bats?”)
So it may be, for instance, that each bat has 10% of the subjective experience of a human, but that the extra 90% makes it millions of times more probable that the experiencer will be pondering this question.
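To make that arithmetic concrete, here is a minimal sketch of the idea that anthropic probability might track not raw experience-measure but measure weighted by how often the experiencer ponders the question. Every number in it (the 10% figure aside, which comes from the paragraph above) is an illustrative assumption, not a claim about actual bats or humans:

```python
# Toy anthropic arithmetic: weight each population by
# (count) * (experience measure) * (rate of pondering this question).
# All numbers are illustrative assumptions.

n_humans = 1
n_bats = 1_000_000

human_measure = 1.0       # experience measure of one human
bat_measure = 0.1         # assume each bat carries 10% of a human's measure

human_ponder_rate = 1e-6  # fraction of human experience spent on this question
bat_ponder_rate = 1e-12   # assume bats almost never ponder it

human_weight = n_humans * human_measure * human_ponder_rate
bat_weight = n_bats * bat_measure * bat_ponder_rate

p_human = human_weight / (human_weight + bat_weight)
print(f"P(the pondering observer is human) = {p_human:.2f}")
# Total bat measure (1,000,000 * 0.1 = 100,000) dwarfs human measure (1.0),
# yet under these assumed pondering rates the weighted probability is ~0.91:
# the "extra 90%" of human experience carries nearly all the weight
# for this particular question.
```

With these assumed rates, being vastly outnumbered in raw measure is compatible with being the overwhelmingly likely ponderer, which is the point of the 10%/90% example.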