There’s a problem in assuming that consciousness is a 0⁄1 property; that you’re either conscious, or not.
There’s another problem in assuming that YOU are a 0⁄1 property; that there is exactly one atomic “your consciousness”.
Reflect on the discussion in the early chapters of Daniel Dennett’s “Consciousness Explained”, about how consciousness is not really a unitary thing, but the result of the interaction of many different processes.
An ant has fewer of these processes than you do. Instead of asking “What are the odds that ‘I’ ended up as me?”, ask, “For one of these processes, what are the odds that it would end up in me, rather than in an ant?”
According to Wikipedia’s entry on biomass, ants have 10-100 times the biomass of humans today.
According to Wikipedia’s list of animals by neuron count, ants have 10,000 neurons.
According to that page, and this one, humans have 10^11 neurons.
Information is proportional not to the number of neurons, but to the number of patterns that can be stored in those neurons, which is likely somewhere between N and N^2. I’m gonna call it NlogN.
I weigh as much as 167,000 ants. Each of them has ~ 10,000 log(10,000) bits of info. I have ~ 10^11 log(10^11) bits of info. I contain as much information as 165 times my body-mass worth of ants.
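As a quick sanity check of that arithmetic (a sketch; log base 10 throughout, consistent with the Hartley convention used in the colony calculation below):

```python
import math

ants_per_human_mass = 167_000          # the equal-body-mass figure above
ant_neurons = 1e4
human_neurons = 1e11

# info ~ N log N (log base 10, i.e. Hartleys)
ant_info = ant_neurons * math.log10(ant_neurons)        # 4e4
human_info = human_neurons * math.log10(human_neurons)  # 1.1e12

ratio = human_info / (ants_per_human_mass * ant_info)
print(round(ratio))  # ~165
```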
So if we ignore how much longer ants have lived than humans, the odds are better that a random unit of consciousness today would turn up in a human, than in an ant.
(Also note that we can only take into account ants in the past, if reincarnation is false. If reincarnation is true, then you can’t ask about the chances of you appearing in a different time. :) )
If you’re gonna then say, “But let’s not just compare ourselves to ants; let’s ask about turning up in a human vs. turning up in any other species”, then you have the dice-labelling problem argued below: You’re claiming humans are the 1 on the die.
Information is proportional not to the number of neurons, but to the number of patterns that can be stored in those neurons,

No, it’s proportional to the log of the number of patterns that can be (semi-stably) stored. E.g. n bits can store 2^n patterns.

which is likely somewhere between N and N^2. I’m gonna call it NlogN.

I’d like to see a lot more justification for this. If each connection were binary (it’s not), and connections were possible between all N neurons (they’re not), then we would have N^2 bits.

No, it’s proportional to the log of the number of patterns that can be (semi-stably) stored.

Oops! Correct. That’s what I was thinking, which is why I said info ∝ NlogN for N neurons. N neurons ⇒ max N^2 connections, 1 bit per connection, max N^2 bits, simplest model.
The math trying to estimate the number of patterns that can be stored in different neural networks is horrendous. I’ve seen “proofs” for Hopfield network capacity ranging from, I think, N/logN to NlogN.
Anyway, it’s more-than-proportional to N, if for no other reason than that the number of connections per neuron is related to the number of neurons. A human neuron has about 10,000 connections to other neurons. Ant neurons don’t.
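For concreteness, here is a minimal Hopfield-network sketch (pure Python, Hebbian outer-product learning) of the kind of network those capacity proofs are about; the pattern sizes here are arbitrary illustrations, far below any of the contested capacity bounds:

```python
def train(patterns):
    """Hebbian outer-product rule; symmetric weights, zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, state, sweeps=5):
    """Repeated sequential threshold updates toward a stored pattern."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(W[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

# Store two 10-neuron patterns, then recover one from a corrupted probe.
p1 = [1, 1, 1, 1, 1, -1, -1, -1, -1, -1]
p2 = [1, -1, 1, -1, 1, -1, 1, -1, 1, -1]
W = train([p1, p2])
probe = [-1] + p1[1:]          # p1 with one bit flipped
print(recall(W, probe) == p1)  # True: the stored pattern is recovered
```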
Humans are more analogous to an ant colony than to an individual ant, so that’s where you should make the comparison: to a number of ant colonies with ant mass equal to your mass. Within each colony, you should treat each ant as a neuron in a large network, meaning you multiply the ant information not by the number of ants Na, but by Na log Na.
Assume 1000 ants/colony. You weigh as much as 167 colonies. Letting N be the number of neurons in an ant (and measuring in Hartleys to make the math easier), each colony has
(N log N)(Na log Na) = (1e4 log 1e4)(1e3 log 1e3) = 1.2e8 H

Multiplying by the number of colonies (since they don’t act like a mega-colony) gives

1.2e8 H × 167 = 2e10 H

This compares with the value for humans:

1e11 log 1e11 = 1.1e12 H
So that means you have ~55 times as much information per unit body weight, not that far from your estimate of 165.
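Spelling out that arithmetic (same numbers as above; log base 10, per the Hartley convention):

```python
import math

def info(n):
    """The N log N information estimate, in Hartleys."""
    return n * math.log10(n)

colony = info(1e4) * info(1e3)  # (N log N)(Na log Na) = 1.2e8 H
ants_total = colony * 167       # ~2e10 H for a human-mass of colonies
human = info(1e11)              # 1.1e12 H
print(round(human / ants_total))  # ~55
```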
I don’t know what implications this calculation has for the topic, even assuming it’s correct, but there you go.
Good point!
This is a very intriguing line of thought. I’m not sure it makes sense, but it seems worth pondering further.
I’m not following your math here, and I’m especially not following the part where if a person contains as much information as 165 ants and there are 1 quadrillion ants and ~ 10 billion people, a given unit of information is more likely to end up in a human than in an ant. And since we do believe reincarnation is false, it’s much worse than that, since ants have been around longer than humans.
Also, I have a philosophical objection to basing it on units of consciousness. If we’re to weight the chances of being a certain animal by the number of bits of information they have, doesn’t that imply we’re working from a theory where “I” am a single bit of information? I’d much sooner say that I am all the information in my head equally, or an algorithm that processes that information, or at least not just a single bit of it.
Oops; that was supposed to say, “I contain as much information as 165 times my body-mass in ants”.
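Under the same N log N assumption, the corrected claim can be checked directly, taking the objection’s 1 quadrillion ants and ~10 billion humans at face value:

```python
import math

def info(n):
    return n * math.log10(n)   # N log N, in Hartleys

ant_total = 1e15 * info(1e4)     # ~4e19 H across all living ants
human_total = 1e10 * info(1e11)  # ~1.1e22 H across all living humans
print(round(human_total / ant_total))  # ~275: most info resides in humans
```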
I’m kinda disappointed that your objection was that the math didn’t work, and not that I’m smarter than 165 ants. (I admit they are winning the battle over the kitchen counter. But that’s gotta be, like, 2000 ants. Don’t sell me short.)
If you want to say that you’re all the information in your head equally, then you can’t ask questions like “What are the odds I would have been an ant?”

That’s an interesting observation.