I didn’t have the number of axons in the corpus callosum to hand, and it turns out to be interesting. If we assume each axon either fires or not, independently of the others, at a rate of up to 200 Hz, then the bit rate for the bus is about 4 gigabits per second. If the brain lives a couple of minutes, you’ll need about 400 gigabits, or 50 gigabytes. This means you get about 4 bytes per brain cell in the other hemisphere.
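A quick sanity check of that arithmetic (a sketch; the axon count, lifetime, and per-hemisphere neuron count are assumed round numbers chosen to reproduce the figures above, not measurements):

```python
# Back-of-envelope estimate of the inter-hemisphere "bus" capacity.
# All inputs are assumed round numbers, not measured values.

axon_count = 20e6     # assumed corpus callosum axon count
max_rate_hz = 200     # each axon fires or not, up to 200 times per second
bits_per_event = 1    # fire / don't fire = 1 bit, assuming independence

bus_bps = axon_count * max_rate_hz * bits_per_event  # ~4e9 bits/s
lifetime_s = 100                                     # "a couple of minutes"
total_bits = bus_bps * lifetime_s                    # ~4e11 bits
total_bytes = total_bits / 8                         # ~5e10 bytes

neurons_per_hemisphere = 1e10  # assumed order-of-magnitude count

print(bus_bps / 1e9, "Gbit/s")                                   # 4.0
print(total_bits / 1e9, "Gbit")                                  # 400.0
print(total_bytes / 1e9, "GB")                                   # 50.0
print(total_bytes / neurons_per_hemisphere, "bytes per neuron")  # 5.0
```

At this level of rounding, 4 or 5 bytes per neuron is the same conclusion; the point is the order of magnitude.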
A single brain cell is so complex that nothing that complex could come into existence as a sheer coincidence over all space and time. It requires an evolutionary process to make something that complex. Four bytes’ worth of coincidence (32 bits, i.e. a roughly one-in-four-billion event, since 2^32 ≈ 4.3 billion) happens essentially instantaneously.
A single brain cell is so complex that nothing that complex could come into existence as a sheer coincidence over all space and time. It requires an evolutionary process to make something that complex.
I think you may have missed the point of the Boltzmann-brain hypothetical. As the volume of space and time under consideration goes to infinity, the probability of such a thing forming by chance converges to one.
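To spell out that convergence (a sketch, under the assumption that disjoint regions of spacetime act as independent trials, each with some fixed probability p > 0 of assembling a brain):

```latex
% Over N independent trials with fixed per-trial probability p > 0:
P(\text{at least one brain}) \;=\; 1 - (1 - p)^{N} \;\longrightarrow\; 1
\quad \text{as } N \to \infty .
```

However small p is, it only slows the convergence; it never changes the limit.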
Four bytes’ worth of coincidence (32 bits, i.e. a roughly one-in-four-billion event, since 2^32 ≈ 4.3 billion) happens essentially instantaneously.
I have no idea how to attach meaning to this sentence. Surely the frequency of a one-in-four-billion event depends on how many trials you conduct per unit time.
My fault for not describing this more specifically. I know that over truly vast stretches of space and time, it eventually becomes quite likely that a Boltzmann brain emerges somewhere in all that vastness. But the space and time required are much greater than our observable universe offers, and the observable universe is what I was referring to in the first case.
I guess my second sentence was intended to mean that any real universe gets through four billion events of the requisite size (cosmic-ray impacts, say) pretty quickly.
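To put a rough number on “pretty quickly” (a sketch; the flux and area figures are assumed order-of-magnitude values I am supplying, not numbers from this thread):

```python
# Expected wait for a 1-in-4-billion event, given an assumed trial rate.
p = 1 / 4e9         # per-trial probability: ~32 bits of coincidence
flux = 1e3          # assumed cosmic-ray primaries per m^2 per second
area = 1.3e14       # rough cross-sectional area of the Earth, m^2
rate = flux * area  # trials per second, ~1.3e17

print(1 / (p * rate), "seconds")  # ~3e-8 s: essentially instantaneous
```

Even cutting the assumed trial rate by ten orders of magnitude leaves the expected wait at around five minutes.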
The interesting part of the hypothesis, as I understand it, is less that the probability of a Boltzmann brain approaches one as the universe grows older (trivially true) and more that the amount of negentropy needed to generate a universe is vastly, sillily larger than that needed to generate a small self-aware system that thinks it’s embedded in a universe at some point in time—and thus that anthropic considerations should guide us to favor the latter. This is of course predicated on the idea that the universe arose from a random event obeying the kind of probability distributions that govern vacuum fluctuations and similar events.
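One common way to make “vastly, sillily larger” precise (my gloss, not something stated above, using the standard Boltzmann/Einstein estimate that a fluctuation supplying an entropy deficit ΔS has probability suppressed exponentially in ΔS):

```latex
% With \Delta S_{\text{brain}} \ll \Delta S_{\text{universe}}:
P(\text{fluctuation}) \;\sim\; e^{-\Delta S / k_B}
\quad\Longrightarrow\quad
\frac{P_{\text{brain}}}{P_{\text{universe}}}
  \;\sim\; e^{(\Delta S_{\text{universe}} - \Delta S_{\text{brain}})/k_B} \;\gg\; 1 .
```

On this accounting, small self-aware fluctuations are overwhelmingly favored over whole-universe fluctuations.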