I like the data center tour :) - I’ve actually used that in some of my posts elsewhere.
And no, I think Jupiter Brains are ruled out by physics.
The locality of physics, chiefly the speed of light, really limits the size of effective computational systems. You want them to be as small as possible.
Given the choice between a planet-sized computer and one that was 10^10 times smaller, the latter would probably be the better option.
The maximum number of bits, and thus storage, is proportional to the mass, but the maximum efficiency is inversely proportional to the radius. Larger systems lose efficiency in transmission, have trouble radiating heat, and waste vast amounts of time because of speed-of-light delays.
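A back-of-envelope sketch of the latency side of that tradeoff, comparing a planet-scale computer to the 10^10-times-smaller alternative (the Jupiter-ish radius is just an illustrative assumption):

```python
# Round-trip signal latency grows linearly with the radius of a
# computational system, while storage grows with volume (mass).
C = 299_792_458.0  # speed of light, m/s

def crossing_time_s(radius_m: float) -> float:
    """One-way light delay across a system of the given radius (diameter / c)."""
    return 2 * radius_m / C

jupiter_r = 7.0e7            # roughly Jupiter's radius in meters (illustrative)
small_r = jupiter_r / 1e10   # the 10^10-smaller alternative: about 7 mm

print(f"planet-scale delay: {crossing_time_s(jupiter_r):.3f} s")  # ~0.47 s
print(f"small-scale delay:  {crossing_time_s(small_r):.2e} s")    # ~4.7e-11 s
```

The small system pays a latency cost ten billion times lower per internal signal, which is the sense in which smaller is better.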
As an interesting side note, in three quite separate lineages (human, elephant, cetacean), mammalian brains all grew to around the same size and then stopped, most likely because of diminishing returns. Human brains are expensive for our body size, but whales have similar-sized brains and it would be very cheap for them to make them bigger, yet they don't. It's a scaling issue: any bigger and the speed loss doesn't justify the extra memory.
There are similar scaling issues with body sizes. Dinosaurs and prehistoric large mammals represent an upper limit: mass increases with volume, but bone strength against shearing stress increases only with cross-sectional area, so eventually the body becomes too heavy for any reasonable bones to support.
Similar 3D/2D scaling issues limited the maximum size of tanks, and they also apply to computers (and brains).
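The square-cube argument above can be made concrete with a toy calculation (the scale factors are purely illustrative, not real animal data):

```python
# Square-cube law: scale a body by factor k in every linear dimension.
# Weight (load) grows with volume ~ k^3, but bone strength grows with
# cross-sectional area ~ k^2, so the stress on the bones grows ~ k.

def relative_bone_stress(k: float) -> float:
    """Bone stress relative to the original body when scaled up by k."""
    load = k ** 3       # mass, and hence weight, scales with volume
    strength = k ** 2   # supporting cross-section scales with area
    return load / strength  # net stress grows linearly with k

for k in (1, 2, 10):
    print(f"{k}x linear size -> {relative_bone_stress(k):.0f}x bone stress")
```

At 10x the linear size the skeleton bears 10x the stress, which is why the biggest land animals needed disproportionately thick limbs and still hit a ceiling.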
So: why think memory and computation capacity isn't important? The data centre that will be needed to immerse 7 billion humans in VR is going to be huge, and why stop there?
The 22 milliseconds it takes light to get from one side of the Earth to the other is tiny—light speed delays are a relatively minor issue for large brains.
For heat, ideally, you use reversible computing, digitise the heat and then pipe it out cleanly. Heat is a problem for large brains—but surely not a show-stopping one.
The demand for extra storage seems substantial. Do you see any books or CDs when you look around? The human brain isn't big enough to handle the demand, and so it outsources its storage and computing needs.
So: why think memory and computation capacity isn't important?
So memory is important, but it scales with mass, and mass usually scales with volume, so there is a tradeoff. And computational capacity is actually not directly related to size; it's more related to energy. But of course you can only pack so much energy into a small region before it melts.
The data centre that will be needed to immerse 7 billion humans in VR is going to be huge—and why stop there?
Yeah, I think the size argument is more against a single big global brain. But sure, data centers with huge numbers of AIs eventually; that makes sense.
The 22 milliseconds it takes light to get from one side of the Earth to the other is tiny—light speed delays are a relatively minor issue for large brains.
Hmm, 22 milliseconds? Light travels a little slower through fiber, and there are always delays. But regardless, the bigger problem is that you are assuming a slow human thought rate of around 100 Hz. If you want to think at the limits of silicon and get thousands or millions of times accelerated, then suddenly the subjective speed of light becomes very slow indeed.
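To put rough numbers on that last point (taking the 22 ms figure quoted above at face value; the speedup factors are illustrative):

```python
# A mind running N times faster than a biological brain experiences an
# objective wire delay as N times longer in subjective time.
EARTH_DELAY_S = 0.022  # the ~22 ms cross-Earth figure quoted above

def subjective_delay_s(objective_delay_s: float, speedup: float) -> float:
    """Perceived delay for a mind accelerated by the given speedup factor."""
    return objective_delay_s * speedup

for speedup in (1, 1_000, 1_000_000):
    d = subjective_delay_s(EARTH_DELAY_S, speedup)
    print(f"{speedup:>9,}x speedup -> {d:,.3f} subjective seconds")
```

At a millionfold speedup the 22 ms hop becomes 22,000 subjective seconds, roughly six hours, so a globe-spanning mind would feel very slow to itself.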