Yes, water and electricity are different in important ways, even though the analogy is sometimes informative.
The energy in the electromagnetic field really truly is different from the kinetic energy of the electrons. (This is one of the important differences from water in a pipe, in fact.)
You can see this fairly easily in a “static” case: if I use electricity to charge up a big capacitor, I’ve stored lots of energy in the capacitor, but it’s potential energy, not kinetic. (There’s a lot of potential energy there because there’s extra positive charge in one place and extra negative charge in another, and energy will be released if they are allowed to move together so that the net charge everywhere becomes approximately zero.)
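If it helps to put a number on it, here’s a minimal back-of-envelope sketch of the stored energy ½CV²; the capacitance and voltage are just assumed, illustrative values, not anything from this discussion:

```python
# Energy stored in a charged capacitor: U = 1/2 * C * V^2.
# C and V are assumed, illustrative values.
C = 10e-3    # farads -- a large electrolytic capacitor
V = 400.0    # volts
U = 0.5 * C * V**2
print(f"stored energy: {U:.0f} J")   # -> stored energy: 800 J
```

And none of that energy is kinetic: the charges are just sitting there.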
You might want to describe this situation by saying that the electrons involved have a certain amount of potential energy, just as you might say that when you lift a heavy object from the surface of the earth that object has acquired (gravitational) potential energy. That point of view works fine for this sort of static situation, but once your charges start moving around it turns out to be more insightful to think of the energy as located in the electromagnetic fields rather than in the particles that interact with those fields.
So, for instance, suppose you arrange for an alternating current to flow in a conductor. Then it will radiate, transmitting energy outward in the form of electromagnetic waves. (Radio waves, at the sort of frequencies you can readily generate in a wire. At much higher frequencies you get e.g. light waves, but you typically need different hardware for that.) This energy is carried by the electromagnetic field. It will propagate just fine in a vacuum: no need for any charged particles in between.
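If you want a feel for why the hardware differs, λ = c/f relates frequency to wavelength; the frequencies below are just illustrative examples:

```python
# Wavelength = c / f for a few illustrative frequencies (values assumed).
c = 3.0e8   # speed of light, m/s
for name, f in [("mains, 50 Hz", 50.0),
                ("AM radio, ~1 MHz", 1.0e6),
                ("visible light, ~600 THz", 6.0e14)]:
    print(f"{name:>25}: wavelength = {c / f:.3g} m")
```

Frequencies you can drive through ordinary wires give wavelengths from hundreds of metres up to thousands of kilometres; light is about half a micron.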
When you have an actual electrical circuit, things are more complicated, but it turns out that the electrical energy is not flowing through the wires, it’s flowing through the field around the wires. And, again, this energy is not the kinetic energy of the electrons.
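One way to convince yourself of that last point is to estimate the electrons’ kinetic energy directly. A rough sketch, with assumed typical numbers for a copper mains cable (not measurements of anything in particular):

```python
# Kinetic energy of the drifting electrons in a mains cable versus the
# energy the circuit delivers each second.  All numbers are assumed,
# typical values.
I = 10.0          # current, amperes
V = 230.0         # mains voltage, volts
A = 1.5e-6        # conductor cross-section, m^2 (1.5 mm^2)
L = 10.0          # cable length, metres
n = 8.5e28        # conduction electrons per m^3 in copper
e = 1.602e-19     # elementary charge, coulombs
m_e = 9.11e-31    # electron mass, kg

v_drift = I / (n * e * A)                 # drift velocity, ~0.5 mm/s
N = n * A * L                             # number of conduction electrons
KE_total = N * 0.5 * m_e * v_drift**2     # their total kinetic energy
P = V * I                                 # power delivered to the load

print(f"drift velocity            : {v_drift:.2e} m/s")
print(f"electron KE in the cable  : {KE_total:.2e} J")
print(f"energy delivered per second: {P:.0f} J")
```

The drifting electrons carry something like 10⁻¹³ J in total, while the circuit delivers a couple of kilojoules every second, so the energy being delivered plainly isn’t their kinetic energy.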
I still have not achieved a breakthrough. See, when we broadcast a wave, say radio, it propagates out into space and is lost forever. Now, by your account, an AC current in a wire radiates energy outward, which means energy is being lost all the time. Since the wattage in a wire is constant, we would be losing a large, constant amount of energy no matter what we do. That doesn’t seem to be the case in real life.
Furthermore, if we accept that electrical energy actually flows in the field around the line, then why do we even need outlets and sockets? Just put a device near the wire, like those cordless chargers. Besides, electricity theft would be easy, since almost anyone could put a specialized stealing device near a public line.
It’s not a big amount (for, e.g., a typical mains cable). And cabling, especially if the currents flowing in it are at high frequencies (which means more radiation), is often designed to reduce that radiation; that’s one reason why we have coaxial cables and twisted pairs. For a 50 Hz or 60 Hz power cable, though, the radiative losses are tiny.
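For a sense of scale, here’s a rough sketch with assumed numbers, treating the cable very generously as a short dipole antenna (an overestimate, since the go and return currents nearly cancel):

```python
# Radiated power if a 10 m mains cable carrying 10 A at 50 Hz is treated,
# very generously, as a short (Hertzian) dipole.  Values are assumed.
import math

f = 50.0     # frequency, Hz
I = 10.0     # RMS current, amperes
l = 10.0     # cable length, metres
c = 3.0e8    # speed of light, m/s

wavelength = c / f                              # 6,000 km at 50 Hz
R_rad = 80 * math.pi**2 * (l / wavelength)**2   # short-dipole radiation resistance
P_rad = I**2 * R_rad                            # radiated power

print(f"wavelength: {wavelength/1e3:.0f} km")
print(f"R_rad     : {R_rad:.2e} ohm")
print(f"P_rad     : {P_rad:.2e} W")   # ~2e-7 W, versus ~2.3 kW delivered
```

So even with the generous assumptions, you radiate a fraction of a microwatt while delivering a couple of kilowatts.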
You can power devices wirelessly—using “those cordless chargers”. They are designed to maximize the extent to which these effects happen, and of course the devices need to be designed to work that way. Ordinary mains cables don’t radiate a lot and it isn’t practical to power anything nontrivial by putting it near a mains cable.
But the most effective way of getting energy from the field around a pair of wires is … to connect the wires into an electric circuit. Indeed, it’s only when they’re connected in such a circuit that the current will flow through the wires and the energy will flow around them.
I recall seeing something about a very low-powered (and cheaply made) LED lightbulb which could never be turned off. With the light switch on it was bright, and with the light switch off it was much dimmer, but not actually off. It turned out this was because, in certain common house-wiring configurations, electric-field effects (capacitive coupling) between nearby wires let enough power through to light the bulb.
https://www.youtube.com/watch?v=1uEmX5XClPY
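For what it’s worth, the numbers are plausible. A rough sketch, assuming a typical wire-to-wire capacitance and run length (both made up here for illustration):

```python
# Current leaking through the capacitance between two wires that run side
# by side in a wall, with the switch open.  The per-metre capacitance and
# run length are assumed, typical values.
import math

V = 230.0          # mains voltage, volts
f = 50.0           # mains frequency, Hz
C_per_m = 70e-12   # ~70 pF per metre between adjacent conductors (assumed)
run = 20.0         # metres of cable between switch and bulb (assumed)

C = C_per_m * run
I_leak = V * 2 * math.pi * f * C   # current through the coupling capacitance

print(f"coupling capacitance: {C*1e9:.2f} nF")
print(f"leakage current     : {I_leak*1e6:.0f} uA")   # ~100 uA
```

Tens to hundreds of microamps is nothing to an incandescent bulb, but it’s enough to make a cheap, low-power LED bulb glow faintly even when “off”.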
Yeah, I’ve had that experience too. But come to think of it, the explanation in the video sounds counter-intuitive when you compare AC and DC. With the bulb connected to the mains via a wire (even though it’s the neutral line and that line is severed), as it is for most of the video, the bulb will always at least glow dimly as long as the mains is AC...
TBH I’m a bit more confused :)