Zuck and Musk point to energy as a quickly approaching deep learning bottleneck over and above compute.
This seems to me like it could slow takeoff substantially and effectively create a wall for a long time.
Best arguments against this?
You can compute where energy is cheap, then send the results (e.g. weights, inference outputs) wherever they're needed.
But Amazon just bought (well, rented) roughly half a nuclear power plant's output (~1 GW) in Pennsylvania, so maybe that doesn't make sense now.
i don’t think the constraint is that energy is too expensive? i think we just literally don’t have enough of it concentrated in one place
but i have no idea actually
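As a rough sanity check on the ~1 GW figure, here's a hedged back-of-envelope sketch. The per-accelerator power draw and the datacenter overhead (PUE) are assumptions for illustration, not numbers from the thread.

```python
# Back-of-envelope: how many accelerators could ~1 GW of dedicated power run?
# All numbers below are illustrative assumptions, not figures from the thread.

site_power_watts = 1e9      # assumed ~1 GW of contracted site capacity
gpu_power_watts = 700       # assumed per-accelerator draw (roughly H100-class TDP)
pue = 1.2                   # assumed overhead for cooling, networking, etc.

effective_watts_per_gpu = gpu_power_watts * pue
num_gpus = site_power_watts / effective_watts_per_gpu

print(f"~{num_gpus:,.0f} accelerators")  # ~1.2 million under these assumptions
```

Under these assumed numbers, a single ~1 GW site supports on the order of a million accelerators, which is why the constraint reads more like "enough power concentrated in one place" than "power is too expensive."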