There’s some motte-and-bailey to this argument, sliding between different levels of effect. With AI, the crux is timelines. It looks like by late 2025 or early 2026 there will be gigawatt-scale training systems costing $15–$50 billion, capable of training a model with 100–400 times GPT-4’s compute in a few months, or of running 100–400 GPT-4-scale experiments. Perhaps this doesn’t move the needle on TAI timelines, but it seems too early to tell.
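A rough back-of-envelope sketch (in Python) shows how a gigawatt-scale system gets into that 100–400x range. All inputs are outside assumptions rather than figures from the text: ~2e25 FLOP for GPT-4’s training run (a commonly cited external estimate), H100-class accelerators at roughly 1e15 dense BF16 FLOP/s peak and ~1.4 kW all-in facility power, 30–40% sustained utilization, and a ~100-day run.

```python
# Back-of-envelope check of the "100-400x GPT-4 compute" claim.
# Every constant below is an assumption, not a figure from the original text.

GPT4_FLOP = 2e25            # assumed GPT-4 training compute (external estimate)
SITE_POWER_W = 1e9          # gigawatt-scale training system
POWER_PER_GPU_W = 1.4e3     # assumed all-in watts per accelerator (chip + host + cooling)
PEAK_FLOPS_PER_GPU = 1e15   # assumed dense BF16 peak, H100-class
MFU = 0.35                  # assumed sustained model-FLOPs utilization
RUN_SECONDS = 100 * 86_400  # "a few months" ~ 100 days

num_gpus = SITE_POWER_W / POWER_PER_GPU_W
total_flop = num_gpus * PEAK_FLOPS_PER_GPU * MFU * RUN_SECONDS
print(f"GPUs: {num_gpus:,.0f}")
print(f"Training compute: {total_flop:.2e} FLOP "
      f"(~{total_flop / GPT4_FLOP:.0f}x the assumed GPT-4 compute)")
```

Under these assumptions this lands around 100x GPT-4; swapping in FP8 throughput, higher utilization, or a longer run pushes it toward the upper end of the 100–400x range.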