GPT-4 level models can be trained with mere thousands of GPUs. Export restrictions on a product that's otherwise on the open market aren't going to work at that scale, and replacement with inferior accelerators remains feasible. But each GPT generation is roughly 30x the compute of the last, and procuring 100K or 3,000K GPUs (or many more of their inferior alternatives) is much closer to a practical impossibility.
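As a back-of-envelope illustration of that scaling (a minimal sketch; the ~3,000-GPU baseline and the fixed training duration are assumptions, only the ~30x-per-generation multiplier comes from the argument above):

```python
# Back-of-envelope: GPU fleet needed per GPT generation, assuming training
# compute grows ~30x per generation and the fleet scales linearly with
# compute (training duration and per-chip efficiency held fixed).

BASELINE_GPUS = 3_000   # assumed "mere thousands" fleet for a GPT-4-level run
PER_GENERATION = 30     # assumed compute multiplier per GPT generation

for step in range(3):
    gpus = BASELINE_GPUS * PER_GENERATION ** step
    print(f"GPT-4 + {step} generation(s): ~{gpus:,} GPUs")
# -> ~3,000, then ~90,000 (~100K), then ~2,700,000 (~3,000K)
```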
Huawei claims it is catching up with Nvidia: https://www.huaweicentral.com/ascend-910b-ai-chip-outstrips-nvidia-a100-by-20-in-tests-huawei/