the delta for power efficiency is currently ~1000 times in favor of brains
⇒
brain: ~20 W,
AGI: ~20kW,
kWh in Germany: ~0.33 Euro
20 kWh: ~6.60 Euro
⇒ running our AGI would, assuming your description of the situation is correct, cost around 6.60 Euros in energy per hour, which is cheaper than a human worker.
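A quick sanity check of that arithmetic as a sketch; the 20 kW draw and the 0.33 EUR/kWh price are just the assumptions stated above, not measurements:

```python
# Back-of-envelope check of the energy-cost claim above.
# Both inputs are the assumptions stated in this thread.
agi_power_kw = 20          # assumed AGI power draw (~1000x a 20 W brain)
price_eur_per_kwh = 0.33   # assumed German retail electricity price

cost_per_hour_eur = agi_power_kw * price_eur_per_kwh
print(f"~{cost_per_hour_eur:.2f} EUR per hour of operation")  # ~6.60 EUR
```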
So … while I don’t assume that such estimates need to be correct, or that they apply to an AGI (which doesn’t exist yet), I don’t think you are making a very convincing point so far.
We’re talking about the scenario of “the ASI wouldn’t be able to afford the compute to remain in existence on stolen computers and stolen money”.
There are no 20 kilowatt personal computers in existence. Note that you cannot simply botnet them together, because the activations of current neural networks require too much bandwidth between nodes for the machine to operate at useful timescales.
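To make the bandwidth objection concrete, here is a rough latency sketch; the hidden size, layer count, link speed, and latency are all illustrative assumptions, not measurements of any real model or network:

```python
# Illustrative estimate of why pipelining a large model across a botnet
# of consumer machines is too slow. Every number below is an assumption.
hidden_dim = 8192        # assumed model hidden size
bytes_per_value = 2      # fp16 activations
layers = 80              # assumed layer count, one layer per node
link_bit_s = 100e6       # assumed ~100 Mbit/s residential link
rtt_s = 0.05             # assumed ~50 ms round trip between random hosts

activation_bytes = hidden_dim * bytes_per_value     # ~16 KB per layer boundary
transfer_s = activation_bytes * 8 / link_bit_s      # serialization time per hop
per_token_s = layers * (rtt_s + transfer_s)         # hops are sequential per token

print(f"~{per_token_s:.1f} s per generated token")  # ~4.1 s/token: not a useful timescale
```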
I am also assuming an ASI needs more compute and resources than merely an AGI, and not linearly more: I estimate the floor for the AGI → ASI jump is at least 1000 times the computational resources. This follows from the fact that on most benchmarks utility improves only logarithmically with compute, so even small improvements demand order-of-magnitude increases in compute.
So 20 kW * 1000 = 20 megawatts. That is the technical reason: you need large improvements in algorithmic efficiency, or much more efficient and ubiquitous computers, for the “escaped ASI” threat model to be valid.
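A minimal sketch of the scaling assumption behind that 1000x floor; the logarithmic-returns model and its coefficient are assumptions for illustration, not fitted to any benchmark:

```python
# Toy model of "utility improves only logarithmically with compute":
#   utility(C) = utility_agi + a * log10(C / C_agi)
# where a is the assumed utility gained per 10x increase in compute.
a = 1.0

def compute_multiplier(utility_gain: float) -> float:
    """Compute multiplier over the AGI baseline needed for a given utility gain."""
    return 10 ** (utility_gain / a)

# Under these assumptions, a 3-point jump in utility (AGI -> ASI) costs
# 1000x the compute, i.e. a 20 kW machine becomes a ~20 MW machine.
print(compute_multiplier(3.0))   # 1000.0
```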
If you find this argument “unconvincing”, please provide numerical justification. What do you actually assume to be true? If you believe an ASI needs only linearly more compute, please cite a paper that demonstrates this on any AI benchmark.
You were the one who made that argument, not me. 🙄