What I know about it from high school and general articles on the net doesn’t satisfy me. Maybe that’s because I have critical holes in my knowledge.
From what I think I know: we have AC running in the lines. AC means that if we zoom down, we’ll see an electron zipping along in one direction, and after 1/50 sec (or 1/100?) that very electron will zip back in the opposite direction, ideally back to the specific point we’re looking at, because the phases are supposed to be equal.
So how does resistance come into the picture at atomic scale? Conductors heat up after a while, so maybe that’s because some of the electrons’ kinetic energy gets transferred into the wire’s temperature? Does this mean the electron slows down? But then does that mean electricity will somehow, sometime propagate slower than light?
Most if not all of our devices actually use DC, using relays(?) to get it from AC. From the only type of relay that was explained to me, the DC current the device receives seems to be on & off. One moment the electrons are moving forward and the relay allows them to flow into the device; 1/50 s later the electrons move “backward” and it cuts the circuit so they can’t flow back, and the device doesn’t have to lose electrons that way (but it doesn’t gain anything either, thus my ‘on & off’ understanding). So my question is: is it detrimental to the device? Is it responsible for the flickering of lights & other stuff? If so, was the number 50 Hz chosen for the main purpose of making that flickering imperceptible to us?
This leads to another big pondering. Why the fuck don’t they just use DC from the source? There are some methods to transfer DC along big distances, and they seem to be tried and probably true. Or is the reason simply inertia? That people are so used to AC and the systems for AC are all over the place, so switching is not cost-effective? Is there research on this very subject yet?
Last but not least, I wonder how exactly devices “consume” electricity. Like, is it that many electrons enter the device but fewer exit? If not so, how do counters count our consumption?
The speed at which electrical signals propagate is much faster than the speed at which electrons move in an electrical conductor. (Possibly helpful metaphor: suppose I take a broomstick and poke you with it. You feel the poke very soon after I start shoving the stick, even though the stick is moving slowly. You don’t need to wait until the very same bit of wood I shoved reaches your body.)
The speed at which electrical signals propagate is slower than the speed of light, but it’s a substantial fraction of the speed of light and it doesn’t depend on the speed at which the electrons move. (It may correlate with it—e.g., both may be a consequence of how the electrons interact with the atoms in the conductor. Understanding this right is one of the quantum-mechanical subtleties I mention below.)
When current flows through a conductor with some resistance, some of the energy in the flow of the electrons gets turned into random-ish motion in the material, i.e., heat. This will indeed make the electrons move more slowly but (see above) this doesn’t make much difference to the speed at which electrical effects propagate through the conductor.
(What actually happens in electrical conductors is more complicated than individual electrons moving around, and understanding it well involves quantum-mechanical subtleties, of most of which I know nothing to speak of.)
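The contrast between the crawl of individual electrons and the near-light-speed signal can be put into rough numbers. A back-of-envelope sketch (all values assumed: 1 A through a 1 mm² copper wire, and a typical cable velocity factor of ~0.7):

```python
# Back-of-envelope: drift speed of electrons vs. signal speed in a copper wire.
# Assumed values: 1 A through a 1 mm^2 copper conductor (typical figures).

I = 1.0          # current, amperes (assumed)
A = 1e-6         # cross-section, m^2 (1 mm^2, assumed)
n = 8.5e28       # free-electron density of copper, 1/m^3
q = 1.602e-19    # elementary charge, coulombs
c = 3.0e8        # speed of light, m/s

# v_d = I / (n * A * q): how fast the electrons themselves crawl along.
v_drift = I / (n * A * q)

# The signal, by contrast, travels at a large fraction of c; the exact
# fraction depends on the cable's geometry and insulation (0.7 is assumed).
v_signal = 0.7 * c

print(f"drift speed:  {v_drift * 1000:.3f} mm/s")
print(f"signal speed: {v_signal / 1000:.0f} km/s")
```

With these numbers the electrons drift at well under a millimetre per second, while the signal covers hundreds of thousands of kilometres per second.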
It is not usual to convert AC to DC using relays; that conversion (rectification) is normally done with diodes.
It is true that if you take AC power, rectify it using the simplest possible circuit, and use that to supply a DC device then it will alternate between being powered and not being powered—and also that during the “powered” periods the voltage it gets will vary. Some devices can work fine that way, some not so fine.
In practice, AC-to-DC conversion doesn’t use the simplest possible circuit. It’s possible to smooth things out a lot so that the device being powered gets something close to a constant DC supply.
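As a toy illustration of that smoothing, here’s a crude simulation (with assumed component values) of a full-wave-rectified 50 Hz sine, with and without a smoothing capacitor:

```python
import math

# Toy simulation: full-wave rectified 50 Hz sine, with and without a
# smoothing capacitor. The peak voltage and RC time constant are assumed
# for illustration, not taken from any particular circuit.

f = 50.0            # mains frequency, Hz
V_peak = 325.0      # peak of a 230 V RMS mains supply
dt = 1e-5           # simulation time step, s
tau = 0.1           # RC time constant of the smoothing stage, s (assumed)

v_cap = 0.0
min_raw, min_smooth = float("inf"), float("inf")
t = 0.0
while t < 0.1:  # simulate 100 ms = 5 mains cycles
    v_raw = abs(V_peak * math.sin(2 * math.pi * f * t))  # full-wave rectified
    if v_raw > v_cap:
        v_cap = v_raw                # diode conducts: capacitor charges
    else:
        v_cap -= (v_cap / tau) * dt  # diode blocked: capacitor feeds the load
    if t > 0.02:  # skip the start-up transient
        min_raw = min(min_raw, v_raw)
        min_smooth = min(min_smooth, v_cap)
    t += dt

print(f"raw rectified output dips to about {min_raw:.1f} V")
print(f"smoothed output dips only to about {min_smooth:.1f} V")
```

The raw rectified output collapses to (nearly) zero a hundred times a second, while the smoothed output only sags by a few percent between peaks.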
But there are similar effects even when no rectification is being done. You mentioned flickering lights, and until recently they were an example of this. If you power an incandescent bulb using AC at 50Hz then the amount of current flowing in it varies and accordingly so does the light output. (At 100Hz, not 50Hz; figuring out why is left as an exercise for the reader.) However, because it takes time for the filament to heat up and cool down the actual fluctuation in light output is small. Fluorescent bulbs respond much faster and do flicker, and some people find their light very unpleasant for exactly that reason. LED lights, increasingly often used where incandescents and fluorescents used to be, are DC devices. I think there’s a wide variety in the circuitry used to power them, but most will flicker at some rate. Good ones will be driven in such a way that they flicker so fast you will never notice it. (Somewhere in the kHz range.)
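For anyone who wants a numerical nudge on that exercise (warning: it gives the answer away), a resistive filament dissipates an instantaneous power proportional to sin², and counting the peaks of that waveform over one second shows the flicker rate:

```python
import math

# Hint for the "100 Hz, not 50 Hz" exercise: a filament's light output
# tracks the instantaneous power p(t) = v(t) * i(t), which for a resistor
# is proportional to sin^2. Count the power peaks in one second.

f = 50.0   # mains frequency, Hz
dt = 1e-5  # sample spacing, s
samples = [math.sin(2 * math.pi * f * k * dt) ** 2 for k in range(int(1 / dt))]

# A peak is a sample larger than both of its neighbours.
peaks = sum(
    1
    for k in range(1, len(samples) - 1)
    if samples[k - 1] < samples[k] > samples[k + 1]
)
print(f"power peaks per second: {peaks}")  # twice the mains frequency
```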
Sometimes DC (at high voltages) is used for power transmission. I think AC is used, where it is used, because conversion between (typically very high) transmission voltage and the much lower voltages convenient for actual use is easy by means of transformers; transformers only work for AC. (Because they depend on electromagnetic induction, which works on the principle that changes in current produce magnetic fields and changes in magnetic field produce currents.) I don’t know whether AC or DC would be a better choice if we were starting from scratch now, but both systems were proposed and tried very early in the history of electrical power generation and I’m pretty sure all the obvious arguments on both sides were aired right from the start.
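The transformer relationship above can be sketched for an ideal (lossless) transformer: the voltage ratio equals the turns ratio, and power in equals power out. The 11 kV-to-230 V figures and turn counts below are assumed, purely for illustration:

```python
def ideal_transformer(v_in, n_primary, n_secondary, p_load):
    """Return (output voltage, input current, output current) for an
    ideal (lossless) transformer supplying p_load watts."""
    v_out = v_in * n_secondary / n_primary  # V_s / V_p = N_s / N_p
    i_out = p_load / v_out
    i_in = p_load / v_in                    # power is conserved
    return v_out, i_in, i_out

# Assumed example: an 11 kV distribution feed stepped down to 230 V,
# supplying a 2.3 kW load.
v_out, i_in, i_out = ideal_transformer(11_000.0, 1100, 23, 2300.0)
print(f"output voltage: {v_out:.0f} V")
print(f"current drawn from the 11 kV line: {i_in:.2f} A")
print(f"current delivered to the load: {i_out:.0f} A")
```

Note how the high-voltage side carries far less current for the same power, which is the whole point of transmitting at high voltage.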
When a device “consumes” electrical energy it isn’t absorbing electrons. (In that case it would have to accumulate a large electrical charge. That’s usually a Bad Thing.) It’s absorbing (or using in some other way) energy carried in the electric field. It might help to imagine a system that transmits energy hydraulically instead, with every household equipped with high-pressure pipes, with a constant flow of water maintained by the water-power company, and operating its equipment using turbines. These wouldn’t consume water unless there were a leak; instead they would take in fast-moving water and return slower-moving water to the system. An “AC” hydraulic system would have water moving to and fro in the pipes; again, the water wouldn’t be consumed, but energy would be transferred from the water-pipes to the devices being operated. Powering things with electricity is similar.
DC wasn’t really a viable option at the start because of the transformer issue you mentioned. The local power lines carry ~100x higher voltage than what you get in your house, and the long distance power lines up to another 100x on top of that. Without that voltage step up, you’d need 100-10,000x as much wire.
Modern semiconductors change the game considerably, though. In a lot of areas, the big iron transformers are being phased out and replaced with switching power supplies, which suggests that it could be economically efficient now, if not for the requirement for a 50 or 60 Hz sine wave and all the existing equipment.
A DC-based system would have the advantages of not requiring rectification for many end uses, some minor improvement in corona losses in transmission, and allowing variable-speed generators. It would come at the cost of losing controller-less induction motors and clocks that use the AC signal to keep time. I’m not sure about the cost of doing the voltage step-up/step-down, because both methods are still in use. I’m not sure which would be the better choice now, but it is an interesting question.
Thank you. Using the water pipe analogy, I can see some obvious flaws with the AC system. What if something needs power right at the moment the water is in the middle state between to & fro, i.e. at a standstill? How about installing a converter device at the entrance of each household? Surely it’d be better to provide a continuous flow to devices, not to mention there’d be no need to manufacture trillions of small relays or rectifiers inside devices.
If what devices do is get fast water and release slow water, then it can be understood that in reality, they use the kinetic energy of electrons. Or maybe some devices make use of the magnetic field too? Can somebody detail how exactly a fan gets a flow of electrons and ends up moving its rotor? And how does a computer use electricity? A fridge?
The water (or, rather, the electricity) sloshes to and fro 50 times a second, so there’s never enough delay between flicking the switch and getting usable power that a human being would notice. Typically other things are slower; e.g., if you’re turning on an incandescent lightbulb then it may take longer than that for the filament to get hot enough that it starts glowing. For many devices (e.g., your phone) there is a converter device, and when you attach your phone to its USB wall-plug it’s getting DC electricity from it.
It would be possible to have some sort of converter for every household, but every such converter has some losses, and many devices are perfectly happy just running off AC, and ones that aren’t don’t necessarily all want the same operating voltage. Again, if we were doing everything from scratch now it might be worth considering something like that (or it might not; the details matter and I’m not an electrical engineer myself), but we have a basically-working system and replacing it wholesale with something new would need to be a big improvement to be worth the tremendous cost and inconvenience.
It would be more accurate to say that devices use the energy in the electromagnetic field rather than the kinetic energy of electrons, as such. (There isn’t a clear distinction between using the electric field and using the magnetic field; the two are very intimately linked and, e.g., if two observers are moving rapidly relative to one another, then what one sees as the electric field the other may see as the magnetic field.)
The motor in an electric fan works something like this. (Unfortunately it involves effects that don’t have a close analogue in terms of flowing water.) There are coils of wire. You pass an alternating current through these coils; changing currents generate a magnetic field. (This isn’t meant to be obvious. It was one of the big discoveries of 19th-century physics.) There’s a lump of iron placed so that this magnetic field pulls on it. A bit of engineering ingenuity lets you arrange these elements so that the effect is to make a shaft keep turning in a consistent direction. You mount your fan blades on that shaft. (Don’t take my description too literally. An actual design might e.g. have the wires on the shaft and the big lumps of iron on the outside, not moving.) In terms of individual electrons: a moving electron produces a magnetic field that “curls around” its path; a whole lot of electrons moving along a conductor produce a magnetic field that curls around those conductors; if you wind that conductor into a coil, you get a magnetic field running along the length of the coil.
The details of how energy flows from place to place in all this are subtle and I will probably get them wrong if I try to go into details. As an example: suppose you supply electricity to some system by means of a pair of parallel wires with opposite currents flowing in them; then the energy flow in the system happens outside the wires, not inside them. (It happens near to the wires, and the energy flows parallel to the wires.)
(Just to reiterate: this isn’t a matter of electrons flowing into the device and being consumed, just as a hydraulically powered system that works by having water turn turbine blades doesn’t work by consuming the water.)
I think most of the power consumption in (the processing parts of) a computer is resistive losses—i.e., the thing where energy from the electric field gets transferred to kinetic energy in the electrons and/or atoms and heats things up. In an idealized maximally-efficient computing device, it turns out that the one thing that unavoidably costs energy is disposing of information, and some people have speculated about “reversible computing” that never erases bits or otherwise throws information away; but real computing devices are several orders of magnitude away from being limited by these considerations.
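To put rough numbers on “several orders of magnitude”: the Landauer bound is k_B·T·ln 2 per erased bit. The CPU figures below (100 W, ~10¹⁸ bit operations per second) are assumed, order-of-magnitude guesses, not measurements:

```python
import math

# Landauer's bound: erasing one bit costs at least k_B * T * ln(2) joules.
# Compare with a rough, assumed figure for a real CPU.

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, kelvin

landauer_per_bit = k_B * T * math.log(2)  # joules per erased bit

cpu_power = 100.0        # watts (assumed)
bit_ops_per_sec = 1e18   # bit operations per second (assumed)
real_per_bit = cpu_power / bit_ops_per_sec

print(f"Landauer limit:   {landauer_per_bit:.2e} J/bit")
print(f"assumed real CPU: {real_per_bit:.2e} J/bit")
print(f"ratio above the limit: {real_per_bit / landauer_per_bit:.1e}")
```

Even with these generous assumptions the real device sits thousands of times above the theoretical floor, consistent with the “several orders of magnitude” claim.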
I believe a fridge uses electrical energy mostly in motors, which work in much the same way as the motor in a fan. These motors then drive other interesting systems that e.g. compress fluids and pump them around and so forth—I don’t know any of the details offhand—but electricity is not directly involved in those mechanisms.
As I hope I’ve already made clear, I’m not really an expert on this, and quite possibly no other LW regulars are either. You might do better to find e.g. a textbook on electromagnetism. (But be warned: if you read a textbook on electromagnetism that goes deep enough to answer your questions, you will end up having to do quite a lot of maths.)
I think using the water as an analogy to electricity is still somehow not adequate to the task. For example, to make it slosh back & forth would require a tremendous amount of energy, which seems not to be the case with electricity.
But still, I also think that if a device consumes electricity, no matter in what way (say, using the electromagnetic field), then it must be reflected in the lifeline of the wire (the electrons) in some way. Since the power source propagates energy using the jiggling of electrons, then by using that up, the device must impede the movement. This slowing in the jiggling will then propagate back and show up as the slowing of the turbine...
… which is to say, actually we convert kinetic energy into whatever type of energy we use, that’s the essence of “electricity”?
BTW, thank you for your explanations on fans & stuff! The bits about computers & fridges are a gloss-over, but I guess I can get a vague understanding.
Yes, water and electricity are different in important ways even though the analogy is informative sometimes.
The energy in the electromagnetic field really truly is different from the kinetic energy of the electrons. (This is one of the important differences from water in a pipe, in fact.)
You can see this fairly easily in a “static” case: if I use electricity to charge up a big capacitor, I’ve stored lots of energy in the capacitor but it’s potential not kinetic energy. (There’s a lot of potential energy there because there’s extra positive charge in one place and extra negative charge in another, and energy will be released if they are allowed to move together so that the net charge everywhere becomes approximately zero.)
You might want to describe this situation by saying that the electrons involved have a certain amount of potential energy, just as you might say that when you lift a heavy object from the surface of the earth that object has acquired (gravitational) potential energy. That point of view works fine for this sort of static situation, but once your charges start moving around it turns out to be more insightful to think of the energy as located in the electromagnetic fields rather than in the particles that interact with those fields.
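To put assumed numbers on the stored-energy idea: for a capacitor, E = ½CV². The 1 F / 100 V values below are made up just to get a feel for the magnitudes:

```python
# Energy stored in a charged capacitor: E = (1/2) * C * V^2.
# Example values are assumed, purely for illustration.

def capacitor_energy(capacitance_farads, voltage_volts):
    return 0.5 * capacitance_farads * voltage_volts ** 2

# A large 1 F "supercapacitor" charged to 100 V:
E = capacitor_energy(1.0, 100.0)
print(f"stored energy: {E:.0f} J")

# For comparison: gravitational potential energy of lifting 10 kg.
m, g, h = 10.0, 9.81, 1.0
print(f"equivalent to lifting 10 kg by {E / (m * g * h):.0f} m")
```

None of that energy is kinetic; it’s sitting in the field between the separated charges.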
So, for instance, suppose you arrange for an alternating current to flow in a conductor. Then it will radiate, transmitting energy outward in the form of electromagnetic waves. (Radio waves, at the sort of frequencies you can readily generate in a wire. At much higher frequencies you get e.g. light waves, but you typically need different hardware for that.) This energy is carried by the electromagnetic field. It will propagate just fine in a vacuum: no need for any charged particles in between.
When you have an actual electrical circuit, things are more complicated, but it turns out that the electrical energy is not flowing through the wires, it’s flowing through the field around the wires. And, again, this energy is not the kinetic energy of the electrons.
I still have not achieved a breakthrough. See, when we broadcast a wave, say radio, then it will propagate into space and will be lost forever. Now as per your words, an AC flow in a wire will radiate energy outward ⇒ this means a lot of energy is lost all the time. Since the wattage in a wire is a constant, we lose a big and constant amount of energy no matter what we do. That seems not to be the case in real life.
Furthermore, if we accept that electrical energy actually flows in the field around the line, then why do we even need outlets and sockets? Just put a device near the wire, like those cordless chargers. Besides, electric thieves can be easy since almost everyone can put a specialized stealing device near a public line.
It’s not a big amount. (For, e.g., a typical mains cable.) And cabling, especially if the currents flowing in it are at high frequencies (which means more radiation), is often designed to reduce that radiation. That’s one reason why we have coaxial cables and twisted pairs. For a 50Hz or 60Hz power cable, though, the radiative losses are tiny.
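A crude way to see how tiny the loss is: treat the cable as an electrically short dipole with radiation resistance 20π²(L/λ)². This over-estimates a real paired cable, whose opposite currents cancel most of the radiation; the cable length and current below are assumed example values:

```python
import math

# Crude upper-bound estimate of radiative loss from a mains cable, modelled
# as an electrically short dipole: R_rad = 20 * pi^2 * (L / lambda)^2.
# A real two-conductor cable radiates far less, since the opposite currents
# largely cancel. Length and current are assumed for illustration.

c = 3.0e8   # speed of light, m/s
f = 50.0    # mains frequency, Hz
L = 10.0    # cable length, m (assumed)
I = 10.0    # RMS current, A (assumed: ~2.3 kW at 230 V)

wavelength = c / f                            # 6,000 km at 50 Hz
R_rad = 20 * math.pi ** 2 * (L / wavelength) ** 2
P_rad = I ** 2 * R_rad                        # radiated power

print(f"wavelength at 50 Hz: {wavelength / 1000:.0f} km")
print(f"radiation resistance: {R_rad:.2e} ohms")
print(f"radiated power at {I:.0f} A: {P_rad:.2e} W")
```

Because the cable is absurdly short compared with the 6,000 km wavelength, even this pessimistic model gives well under a microwatt.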
You can power devices wirelessly—using “those cordless chargers”. They are designed to maximize the extent to which these effects happen, and of course the devices need to be designed to work that way. Ordinary mains cables don’t radiate a lot and it isn’t practical to power anything nontrivial by putting it near a mains cable.
But the most effective way of getting energy from the field around a pair of wires is … to connect the wires into an electric circuit. Indeed, it’s only when they’re connected in such a circuit that the current will flow through the wires and the energy will flow around them.
I recall seeing something about a very low-powered (and cheaply made) LED lightbulb which could never be turned off. With the light switch on, it was bright, and with the light switch off it was much dimmer, but not actually off. It turned out this was because in certain common house wiring configurations, electrical field effects between nearby wires allow enough power through to light the bulb.
https://www.youtube.com/watch?v=1uEmX5XClPY
Yeah, I did have that experience too. But come to think of it, his explanation in the video sounds counter-intuitive for AC & DC. With the bulb connected to the mains via a wire (even though it’s the neutral line and that line is severed) like in the better part of the video, as long as the mains is AC the bulb should always be at least dimly lit...
TBH I’m a bit more confused :)
Perhaps you already know this, but some of your statements made me think you don’t. In an electric circuit, individual electrons do not move from the start to the end at the speed of light. Instead, they move much more slowly. This is true regardless of whether the current is AC or DC.
The thing that travels at the speed of light is the *information* that a push has happened. There’s an analogy to a tube of ping-pong balls, where pushing on one end will cause the ball at the other end to move very soon, even though no individual ball is moving very quickly.
http://wiki.c2.com/?SpeedOfElectrons
Caveat that I have no formal training in physics.
Ooh, indeed I didn’t know, thanks! The actual snail speed does surprise me. I guess an important hole has been patched.
I think Bill Beaty’s page on electricity might be what you’re looking for. Here’s a joking teaser which shows the kinds of questions he’s trying to answer:
And then he goes on to answer all the questions one by one, in a very straightforward way.
Holy cow, I’ve just read to the “poynty” part in his work. Now I have a vague sense of why Tesla wanted to put wireless electricity down into every household. And even Feynmann was afraid of explaining the truth because of its complexity/difficulty.
The electrons in a current never move anything close to the speed of light (https://en.wikipedia.org/wiki/Drift_velocity). It is the propagation of the changes in the electric field caused by the electrons moving that moves at the speed of light. It is more like a tube full of marbles (a stretched analogy). If you push the marble on one end the marble at the other end moves almost instantly. The marble you pushed didn’t move all that distance.
Yes, the heat in conductors comes from the electrons’ kinetic energy. No, it doesn’t really change the propagation speed of the current, since that is the electric field propagating. There is certainly power lost there, though.
It is not easy to transmit DC over long distances (https://en.wikipedia.org/wiki/War_of_the_currents). Edison tried hard to push the adoption of DC, going so far as to publicly electrocute elephants with high-voltage AC as a PR stunt to scare people. You can find videos of this online if you want. It didn’t work, because it was just so much more efficient to transmit AC and use transformers to step it down.
The wiki Currents war article ends with a brief mention of HVDC. China was utilizing it in 2019, and they certainly are not stupid, so...
The HVDC article lists some pros & cons of it versus AC. At a quick glance, there are more pros. And what is the biggest disadvantage? Converter station cost. And what do converter stations do? They convert that DC into AC so it can be distributed to households and then switched back to DC inside the devices so they can use the electricity! All of this clusterfuck nonsense could be avoided if they used an all-out DC system in the first place!
I guess using a war from more than 120 years ago to justify the current (pun intended) situation is not very good.
I can give some partial answers based on my own models:
AC is used for transmission because transformers are ubiquitous and incredibly valuable at all stages of transmission, and transformers work using AC (you need a changing current to produce the changing magnetic field that induces a voltage in the other winding). Transformers allow you to convert the voltage and isolate circuits. Isolation is important for safety, and voltage conversion is important to achieve the competing goals of safety and efficiency. High voltage allows you to transfer more energy with fewer losses, but is far more dangerous to work with. This gets to your resistance question: resistance and heat generation are related to the amount of current and the thickness of the material. To transfer a given amount of energy, higher voltage means less current needed for the same wire, which means lower heat losses.
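The voltage-vs-loss trade-off in numbers: for a fixed delivered power P, the line current is I = P/V and the resistive loss is I²R, so losses fall with the square of the transmission voltage. All example figures below are assumed:

```python
# Why transmit at high voltage: for fixed power P over the same wire,
# I = P / V and the resistive loss is I^2 * R, so losses fall with the
# square of the voltage. Example numbers are assumed for illustration.

def line_loss(power_watts, voltage_volts, line_resistance_ohms):
    current = power_watts / voltage_volts
    return current ** 2 * line_resistance_ohms

P = 1e6   # 1 MW to deliver (assumed)
R = 5.0   # resistance of the line, ohms (assumed)

for V in (10_000.0, 100_000.0, 400_000.0):
    loss = line_loss(P, V, R)
    print(f"{V / 1000:>5.0f} kV: loss = {loss / 1000:8.2f} kW "
          f"({100 * loss / P:.3f}% of the power sent)")
```

Multiplying the voltage by 10 cuts the loss by a factor of 100 for the same wire, which is why long lines run at hundreds of kilovolts.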
Why 50Hz (or 60 in the US)? As far as I know, this is largely arbitrary. I do know that subtle differences in the frequency are used for signaling grid load. https://en.wikipedia.org/wiki/Utility_frequency has a lot of info though!
As for metering, I have no idea how current meters (ammeters/watt meters) work, but I am pretty sure no net electrons are entering or leaving e.g. your house or your appliance. Electrons in a circuit should be conserved, they’re just the means of transfer of energy.
Tks. You mentioned isolation is important for safety. Can you elaborate with some specific examples? As I imagine it, unless the threat has been predicted, AC transformers are useless against sudden issues. Say, an abrupt surge will still propagate via its magnetic field before we can do anything.
Isolation is not about surges, but about preventing current from flowing in a particular path at all. In a transformer, there is no conductive (only magnetic) path from the input side to the output side. So, if you touch one or more of the low-voltage output terminals of a transformer, you can’t thereby end up part of a high-voltage circuit no matter what else you’re also touching; you’ll only experience the low voltage. This is how wall-plug low-voltage power supplies work. Even the ones that use electronic switching converters (nearly all of them today) use a transformer to provide the isolation: the line-voltage AC is converted to higher-frequency AC, run through a small transformer (the higher the frequency, the smaller the transformer you need for the same power), and converted back to DC.
Oh, I was too focused on the system function while forgetting that safety can primarily apply to human health too :)
Why 50/60 Hz? It has to be too low to be heard, too high to be seen, high enough for transformation, low enough for low induction losses, and low enough for simple rotating machines. Trains couldn’t use 50/60 Hz, so they went with 1/3 of it (16 2/3 Hz) or 20 Hz.
Grid frequency is controlled to ±150 mHz; if that fails, private customers might get disconnected/dropped.
The time derivative of the grid frequency is a measure of the relative power mismatch.
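That last relation can be sketched with the simplest “swing equation” model, in which the rate of change of frequency is proportional to the power mismatch. All the grid parameters below are assumed, illustrative values:

```python
# Simple "swing equation" sketch: the rate of change of grid frequency is
# proportional to the generation-load mismatch. One common form:
# df/dt = f0 * delta_P / (2 * H * S), where H is the system inertia
# constant and S the rated generation. All values below are assumed.

f0 = 50.0   # nominal frequency, Hz
H = 5.0     # inertia constant, seconds (assumed)
S = 1e9     # total rated generation, VA (assumed: a 1 GW system)

def freq_rate_of_change(power_mismatch_watts):
    """Hz per second of frequency drift for a given power mismatch."""
    return f0 * power_mismatch_watts / (2 * H * S)

# Suddenly losing a 50 MW plant (5% of this assumed grid):
rocof = freq_rate_of_change(-50e6)
print(f"frequency changes at {rocof:.3f} Hz/s")
```

Grid operators measure this rate of change (“ROCOF”) to detect how badly generation and load are out of balance.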
50-60Hz is not too low to be heard: https://www.youtube.com/watch?v=bslHKEh7oZk
It’s not really too high to be seen either, lights that flicker at mains frequency can be pretty unpleasant on the eyes, and give some people headaches.
True, I hadn’t claimed that all criteria could be or have been met. Because of the noise and the heat, I just the other day replaced the inductive ballasts in some of my very old but still fully functioning kitchen counter lights with modern switching current regulators. The 50 Hz supply produced a 100 Hz hum that had been bothering me for decades. But even some of those regulators can be heard by some people. (Not me; I’m deaf to anything >10 kHz.)
It is a compromise in an area of sensory overlap, but the human senses are not equally sensitive to all frequencies. Your hearing is way better at 3 kHz. At your age you will still remember CRT monitors that would operate at 60 Hz at max resolution: bad, but they did get used.