At first blush this appears to guarantee Anthropic access to enough compute for the next couple of training iterations, at least. I infer the larger training runs are back on.