They cannot just add an OOM of parameters, much less three.
How about 2 OOMs?
HW2.5: 21 TFLOPS. HW3: 72×2 = 72 TFLOPS (redundant). HW4: 3×72 = 216 TFLOPS (not sure about redundancy). And Elon said in June that the next-gen AI5 chip for FSD would be about 10x faster, say ~2 PFLOPS.
By a rough approximation of brain processing power you get about 0.1 PFLOPS per gram of brain, so HW2.5 might have been a 0.2 g baby-mouse brain, HW3 a 1 g baby-rat brain, HW4 perhaps an adult rat, and the upcoming HW5 a 20 g small-cat brain.
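The conversion is simple enough to sanity-check in a few lines. This sketch uses the ~0.1 PFLOPS-per-gram ratio assumed above; the hardware figures are the estimates quoted in this thread, not official specs.

```python
# Rough "brain-gram" equivalents for FSD hardware, using the assumed
# ratio of ~0.1 PFLOPS per gram of brain. All compute figures are the
# thread's estimates (effective, post-redundancy where noted).

PFLOPS_PER_GRAM = 0.1  # assumed conversion factor

hardware_pflops = {
    "HW2.5": 0.021,       # 21 TFLOPS
    "HW3": 0.072,         # 72x2 = 72 TFLOPS effective (redundant)
    "HW4": 0.216,         # 3x72 = 216 TFLOPS
    "AI5 (est.)": 2.0,    # ~10x HW4, per the June comment
    "B200 FP8": 10.0,     # Nvidia Blackwell peak FP8, approximate
}

for name, pflops in hardware_pflops.items():
    grams = pflops / PFLOPS_PER_GRAM
    print(f"{name}: {pflops:6.3f} PFLOPS ~ {grams:.1f} g of brain")
```

Running it reproduces the mapping in the text: HW2.5 lands around 0.2 g (baby mouse), HW4 around 2 g (rat-ish), AI5 around 20 g (small cat), and a B200 around 100 g (large dog/wolf).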
As a real-world analogue, cat to dog (25–100 g brain) seems to me the minimum necessary range of complexity, based on behavioral capabilities, to do a decent job of driving: you need some ability to anticipate and predict the motivations and behavior of other road users, and something beyond dumb reactive handling (i.e. somewhat predictive) to understand the anomalous objects that exist on and around roads.
Nvidia's Blackwell B200 can do up to about 10 PFLOPS of FP8, which is getting into large-dog/wolf brain-processing range, and it wouldn't be unreasonable to package in a self-driving car in a few years, once it's down closer to manufacturing cost, at around 1 kW peak power consumption.
I don't think the rat-brain HW4 is going to cut it, and I suspect that internally Tesla knows it too, but it's going to be crazy expensive to own up to it; better to keep kicking the can down the road with promises until they can deliver the real thing. AI5 might just do it, but it wouldn't be surprising to need a further OOM, to Nvidia Blackwell equivalent, and maybe $10k of extra cost to get there.