The maximum number of bits, and thus storage, is proportional to the mass, but the maximum efficiency is inversely proportional to the radius. Larger systems lose efficiency in transmission, have trouble radiating heat, and waste vast amounts of time because of speed-of-light delays.
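For what it's worth, here is a sketch of one way to ground those scalings; it assumes the storage limit being invoked is the Bekenstein bound and the delay limit is the light-crossing time of the system, neither of which the text names explicitly.

```latex
% Bekenstein bound on the information a region of radius R and energy E can
% hold (an assumption about which limit is meant), plus the light-crossing
% time that sets how fast the whole system can act coherently.
\begin{align*}
  I_{\max} &\le \frac{2\pi R E}{\hbar c \ln 2}\ \text{bits},
    & E \approx M c^{2} &\Rightarrow I_{\max} \propto M R, \\
  t_{\text{cross}} &\sim \frac{2R}{c},
    & f_{\text{coherent}} &\sim \frac{c}{2R}.
\end{align*}
```

At a fixed radius the storage limit grows with the mass-energy, while the rate at which the whole system can update coherently falls off as 1/R.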
So: why think memory and computation capacity isn't important? The data centre that will be needed to immerse 7 billion humans in VR is going to be huge—and why stop there?
The 22 milliseconds it takes light to get from one side of the Earth to the other is tiny—light speed delays are a relatively minor issue for large brains.
For heat, ideally, you use reversible computing, digitise the heat and then pipe it out cleanly. Heat is a problem for large brains—but surely not a show-stopping one.
The demand for extra storage seems substantial. Do you see any books or CDs when you look around? The human brain isn't big enough to handle the demand, and so it outsources its storage and computing needs.
So: why think memory and computation capacity isn't important?
So memory is important, but it scales with the mass, and that usually scales with volume, so there is a tradeoff. And computational capacity is actually not directly related to size; it's more related to energy. But of course you can only pack so much energy into a small region before it melts.
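One way to make that melting trade-off concrete, under the assumptions that the logic is irreversible at temperature T, that cooling is limited by surface area, and that storage scales with mass at roughly constant density:

```latex
\begin{align*}
  E_{\text{op}} &\ge k_B T \ln 2            && \text{(Landauer: minimum heat per erased bit)} \\
  P_{\max} &\propto R^{2}                   && \text{(heat can only leave through the surface)} \\
  N_{\text{bits}} &\propto M \propto R^{3}  && \text{(storage scales with mass, i.e.\ volume)} \\
  \frac{\text{ops/s}}{N_{\text{bits}}} &\lesssim \frac{P_{\max}}{E_{\text{op}}\,N_{\text{bits}}} \propto \frac{1}{R}
    && \text{(sustainable compute per bit falls with size)}
\end{align*}
```

Reversible computing relaxes the first line, but the surface-to-volume cooling constraint still applies to whatever heat is produced.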
The data centre that will be needed to immerse 7 billion humans in VR is going to be huge—and why stop there?
Yeah—I think the size argument is more against a single big global brain. But sure, data centers with huge numbers of AIs eventually—that makes sense.
The 22 milliseconds it takes light to get from one side of the Earth to the other is tiny—light speed delays are a relatively minor issue for large brains.
Hmm, 22 milliseconds? Light travels a little slower through fiber, and there are always delays. But regardless, the bigger problem is that you are assuming a slow human thought rate of around 100 Hz. If you want to think at the limits of silicon and get thousands or millions of times of acceleration, then suddenly the subjective speed of light becomes very slow indeed.
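A back-of-the-envelope version of that last point; the distance, fibre speed, and speedup factors below are illustrative assumptions rather than figures from the discussion.

```python
# Subjective planet-scale latency for a mind running faster than real time.
# Distance, fibre speed, and speedups are illustrative assumptions.

C_KM_S = 299_792.458           # speed of light in vacuum, km/s
FIBRE_KM_S = C_KM_S * 2 / 3    # light in optical fibre travels at roughly 2/3 c
ANTIPODAL_KM = 20_000          # approx. surface distance to the far side of the Earth

one_way_s = ANTIPODAL_KM / FIBRE_KM_S   # one-way fibre delay, ignoring routing/switching

for speedup in (1, 1_000, 1_000_000):
    # A mind running `speedup` times faster experiences the same wall-clock
    # delay as `speedup` times more subjective time.
    subjective_s = one_way_s * speedup
    print(f"{speedup:>9,}x: {one_way_s * 1e3:5.0f} ms wall clock "
          f"≈ {subjective_s:>9.1f} s subjective "
          f"({subjective_s / 3600:.2f} h)")
```

At a millionfold acceleration a roughly 100 ms one-way trip already corresponds to more than a subjective day, which is the sense in which the speed of light becomes slow.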