My comment wasn’t entirely correct.
Having now actually done the math, I realize that there is an optimum resistance for the heating element, with both larger and smaller resistances producing less heat. If V is the voltage, R0 the resistance in the cord, and R1 the resistance in the heating element, the heating power is (V/(R0+R1))^2 R1, which is maximized when R1 = R0. However, half the energy is then wasted in the cord, so I assume engineers aim for a higher R1/R0 ratio. If so, descaling, by keeping the element temperature (and hence its resistance) lower, will indeed reduce R1 and produce more heating power.
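To sanity-check that formula, here is a minimal numeric sketch in Python; the values of V and R0 (and the helper name heating_power) are illustrative assumptions, not measurements of any real kettle:

```python
# Heating power of a series circuit: cord resistance R0, element R1.
# V and R0 below are illustrative assumptions, not real measurements.
V = 230.0    # supply voltage in volts (assumed)
R0 = 0.5     # cord resistance in ohms (assumed)

def heating_power(R1):
    """P = (V / (R0 + R1))^2 * R1: power dissipated in the element."""
    current = V / (R0 + R1)   # same current everywhere in a series circuit
    return current**2 * R1

for R1 in (0.1, 0.25, 0.5, 1.0, 2.0, 20.0):
    print(f"R1 = {R1:5.2f} ohm -> {heating_power(R1):8.1f} W")
# Peaks at R1 = R0 = 0.5 ohm (26450.0 W) and falls off on both sides,
# matching the maximum-power-transfer claim above.
```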
But unlike what I said in my comment, this will make energy efficiency worse, not better: the ratio of useful power to wasted power is just R1/R0, so making R1 bigger and bigger improves efficiency, at the cost of taking longer to heat. Though actually, this is true only for a perfectly insulated kettle. If the kettle loses heat, then at some high enough R1 it will never boil at all (i.e., it will waste an infinite amount of energy).
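And a toy simulation of that last point, under the assumption of a simple linear heat-loss model; the water mass, the loss coefficient k, and the boil_time helper are all inventions of this sketch:

```python
# Toy model: constant-power heating with Newtonian (linear) heat loss.
# Every parameter here is an assumption made up for this sketch.
V, R0 = 230.0, 0.5            # volts; cord resistance in ohms (assumed)
C = 1.5 * 4186.0              # heat capacity of 1.5 kg of water, J/K
k = 8.0                       # heat loss, W per K above ambient (assumed)
T0, T_BOIL = 20.0, 100.0      # ambient and boiling temperatures, Celsius

def heating_power(R1):
    return (V / (R0 + R1))**2 * R1

def boil_time(R1, dt=1.0):
    """Integrate dT/dt = (P - k*(T - T0)) / C; None if it never boils."""
    P = heating_power(R1)
    if P <= k * (T_BOIL - T0):   # losses at 100 C exceed the heating power
        return None
    T, t = T0, 0.0
    while T < T_BOIL:
        T += (P - k * (T - T0)) / C * dt
        t += dt
    return t

for R1 in (0.5, 5.0, 25.0, 80.0, 120.0):
    t = boil_time(R1)
    print(f"R1 = {R1:6.1f} ohm ->", f"boils in {t:.0f} s" if t else "never boils")
```

With these made-up numbers, efficiency keeps improving as R1 grows, but somewhere past 80 ohm the heating power drops below the roughly 640 W being lost at 100 °C, and the kettle never reaches a boil.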
If you assume that the engineer designing the kettle made the optimal tradeoffs for all these factors when choosing R1, then descaling will be good because it will bring R1 back to what the engineer thought was the best value. But how much this actually matters is harder to say.
Hm, doesn’t this assume that none of the power going into the kettle is wasted? But some of it is: after I pour the water out, the heating element is still above room temperature.
Yes, but losses like that are presumably independent of everything else, assuming you pour immediately after the kettle boils (though with a long boil time you may be less attentive and less likely to pour the moment it finally boils).
I don’t think so—if descaling reduces the temperature of the heating element (which I think we agree on), it’ll reduce the power wasted heating it up.
OK, you’re right. Put another way, if the heating element doesn’t get as hot, there is less energy wasted when it cools down after you pour the water out.
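For a rough sense of scale, here is a back-of-envelope estimate; every figure in it is a guess chosen purely for illustration:

```python
# Back-of-envelope estimate of the energy left in the hot element.
# Every figure below is a guess chosen purely for illustration.
m = 0.10          # kg, assumed mass of the element assembly
c = 490.0         # J/(kg*K), specific heat of steel
dT = 50.0         # K, assumed extra element temperature when scaled up
stranded = m * c * dT
boil = 1.5 * 4186.0 * 80.0    # J to heat 1.5 kg of water from 20 to 100 C
print(f"{stranded:.0f} J stranded vs {boil:.0f} J to boil "
      f"({100 * stranded / boil:.1f} % of the total)")
# -> 2450 J stranded vs 502320 J to boil (0.5 % of the total)
```

If guesses of that order are anywhere near right, the residual-heat effect is real but a small fraction of the energy budget, which fits the earlier point that how much any of this matters in practice is harder to say.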