Following bogus, I could imagine endorsing a weaker form of the argument: not that it’s like nothing to be a bat, but that it’s like less to be a bat than to be a human.
In fact, if you’ve ever wondered why you happen to be the person you are, and not someone else, it may be that the reflectivity you are displaying by asking this question puts you in a more-strongly-anthropically-weighted reference class.
Given roughly 10 billion bats alive today, that bats have been around for about 50 million years, and a bat generation of, let's say, 5 years, and assuming the population has been stable over evolutionary history, we get a super rough estimate of something on the order of (10B * (50M/5)) = 100 quadrillion historical bats. I think a lot of anthropic calculations assume there have been about 100 billion historical humans, so the probability of being a human is one-millionth the probability of being a bat.
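A quick back-of-the-envelope version of that estimate (all the inputs are the rough figures above, not real census data):

```python
# Very rough Fermi estimate of historical bats vs. historical humans.
# All inputs are the ballpark numbers from the comment, not actual data.
bats_alive_now = 10e9          # ~10 billion bats at any given time (assumed)
bat_history_years = 50e6       # bats have existed for ~50 million years (assumed)
bat_generation_years = 5       # assumed generation length

historical_bats = bats_alive_now * (bat_history_years / bat_generation_years)
historical_humans = 100e9      # figure often used in anthropic calculations

print(f"historical bats:   {historical_bats:.0e}")    # ~1e17, i.e. 100 quadrillion
print(f"historical humans: {historical_humans:.0e}")  # ~1e11
print(f"P(human) / P(bat) ~ {historical_humans / historical_bats:.0e}")  # ~1e-6
```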
I don’t see a whole lot of difference between not having subjective experiences and having one one-millionth the subjective experience of a human. Once we expand this to all animals instead of just bats, the animals come out even worse.
I’m not sure it follows that a bat has one one-millionth the subjective experience of a human. The problem is that you can’t necessarily add a bunch of bat-experiences together to get something equivalent to a human experience; in fact, it seems to me that this sort of additivity only holds when the experiences are coherently connected to each other. (If someone hooked up a million bat-brains into a giant network, then it might make sense to ask “Why am I a human, rather than a million bats?”)
So it may be, for instance, that each bat has 10% the subjective experience of a human, but that that extra 90% makes it millions of times more probable that the experiencer will be pondering this question.
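To make the shape of that argument concrete, here is a hedged numerical sketch; the weights below are invented for illustration, not estimates of anything:

```python
# Illustrative only: how a per-observer "likelihood of pondering this question"
# weight could offset the raw headcount ratio. The weights are made up.
historical_bats = 1e17
historical_humans = 1e11

ponder_weight_human = 1.0      # a reflective human observer
ponder_weight_bat = 1e-7       # suppose bats almost never entertain the question

human_mass = historical_humans * ponder_weight_human   # 1e11
bat_mass = historical_bats * ponder_weight_bat         # 1e10

p_human_given_pondering = human_mass / (human_mass + bat_mass)
print(f"P(human | pondering) ~ {p_human_given_pondering:.2f}")  # ~0.91
```

Under those made-up weights, the million-to-one headcount advantage of the bats is more than cancelled out by how much more likely a human observer is to be asking the question in the first place.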
Is there a difference between having no subjective experience and having one-millionth the subjective experience of a Tra’bilfin, an advanced alien with an artificially augmented brain capable of a million times the processing of a current human?
You don’t have any issues quantifying over fractions of subjective experience? I haven’t begun to have a clear idea what that even means.