I made a similar point (but without specific numbers—great to have them!) in a comment https://www.lesswrong.com/posts/Lwy7XKsDEEkjskZ77/?commentId=nQYirfRzhpgdfF775 on a post that posited the human brain's energy-efficiency advantage over AI as a core anti-doom argument, and I likewise think the energy-efficiency comparisons are not particularly relevant either way:
Humanity is generating and consuming enormous amounts of power—why is the power budget even relevant? And even if it were, the energy for running brains ultimately comes from the Sun. If you include the agricultural energy chain and "grade" the energy efficiency of a brain by the amount of solar energy it ultimately takes to power it, AI definitely has the potential to be more efficient. And even if a single human brain is fairly efficient, human civilization as a whole is clearly not: with AI, you can quickly scale up the amount of compute you use, whereas for humans, scaling beyond a single brain—coordinating many people—is very inefficient.
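The "grade by solar input" comparison can be made concrete with a rough back-of-envelope calculation. All figures below are illustrative ballparks I'm assuming for the sketch (brain ~20 W, whole-body metabolism ~100 W, crop photosynthetic efficiency ~1%, GPU draw and photovoltaic efficiency), not measured values from the post:

```python
# Back-of-envelope: solar power ultimately required to run a human brain
# vs. a datacenter GPU. All constants are rough, assumed ballpark figures.

BRAIN_POWER_W = 20            # common estimate for the human brain's draw
BODY_POWER_W = 100            # whole-body metabolic rate sustaining that brain
CROP_SOLAR_EFFICIENCY = 0.01  # assumed fraction of sunlight crops convert to food energy

GPU_POWER_W = 700             # assumed draw of a modern datacenter accelerator
PV_EFFICIENCY = 0.20          # typical commercial solar-panel efficiency

# A brain can't be fed in isolation: the whole body's food intake counts.
solar_per_brain_w = BODY_POWER_W / CROP_SOLAR_EFFICIENCY  # ~10,000 W of sunlight

# A GPU powered by photovoltaics.
solar_per_gpu_w = GPU_POWER_W / PV_EFFICIENCY             # ~3,500 W of sunlight

print(f"Solar power per brain: {solar_per_brain_w:,.0f} W")
print(f"Solar power per GPU:   {solar_per_gpu_w:,.0f} W")
```

Under these assumptions, feeding one brain through agriculture consumes roughly 3x the sunlight of running one GPU off solar panels—so the "brains are 20 W, GPUs are 700 W" framing flips once you account for the full energy chain.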