A Brief History Of AC/DC Electricity

July 27, 2016 | 5 Minute Read

In the world of electricity, one question is still asked: what is the difference between alternating current (AC) and direct current (DC)? Here, Geraint Thomas looks at the history.

Let’s start off answering this question with a bit of history about how electricity was discovered. How did we find this invisible form of energy? Perhaps the most significant early discovery came in the 1600s with William Gilbert, who served as physician to Elizabeth I of England. Gilbert found through experiments that rubbing amber and jet (black amber) made them attract small, light objects, and so static electricity was identified as an effect distinct from magnetism.

Though this was a very simple form of electricity, it was the starting point for other inventors and physicists to expand on the theory. The German inventor Otto von Guericke, who also showed that a vacuum could be created, went on to build a machine that generated static electricity by friction: the first ever electric generator. So this is where the first DC electricity supply originated.

In 1797, Joseph Henry was born in Albany, New York. Early in his career he was employed as an assistant engineer on a survey team, later moving from this field to study civil and mechanical engineering. With the change in his studies, Henry started to work with magnets. He found that when wire was wound into coils and excited by magnets and batteries, the electrical properties of the circuit changed.

From these studies Henry, working along the same lines as Michael Faraday, discovered the principles behind the dynamo, and an electric generator was devised, allowing the subsequent development of the electric motor.

It was, however, Thomas Edison who took these theories to the next level by creating the largest dynamo yet built, known as the Edison Dynamo, in 1881. Following Edison, the first plants for electrical distribution were built in Europe and America to support the use of electricity for lighting. With the dynamo still producing DC, however, there was a need to move electricity further afield, and to find a way of doing so without the heavy power losses that came with sending DC over great distances at the low voltages then in use. So, building on the work that had been done by Joseph Henry and Michael Faraday, the transformer was created.

In the late 19th century, Thomas Edison, Nikola Tesla and George Westinghouse went head to head over which electricity system, DC or AC, would become standard. This was dubbed the “war of the currents”; Edison pushed for the DC system, in which electrical current flows steadily in one direction, while Tesla and Westinghouse wanted AC to be the world’s main source of power.

Thus AC electricity was born, allowing electricity to travel greater distances to feed the growing world we see today.

Now that we know how the two forms of electricity were discovered, we can answer the question: what is the difference between AC and DC electricity?

The key differences are in the names: ‘alternating’ current and ‘direct’ current. DC electricity flows in one constant direction, with a voltage of constant polarity (polarity being the term used to describe the direction of current flow in an electrical circuit). AC electricity has a voltage that alternates in polarity, reversing smoothly between positive and negative over time.
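To make the distinction concrete, here is a short Python sketch that samples an idealised 12 V DC supply alongside a 230 V RMS, 50 Hz AC mains supply; the specific figures are illustrative assumptions rather than values from any particular system.

    import math

    # Illustrative figures only: a steady 12 V DC supply next to a 230 V RMS,
    # 50 Hz AC supply, sampled every 2.5 ms across one 20 ms mains cycle.
    V_DC = 12.0                      # constant DC voltage, volts
    V_PEAK = 230.0 * math.sqrt(2)    # peak of a 230 V RMS sine wave, about 325 V
    FREQ = 50.0                      # mains frequency, hertz

    for i in range(9):
        t = i * 0.0025                                      # time, seconds
        v_ac = V_PEAK * math.sin(2 * math.pi * FREQ * t)    # polarity reverses every half cycle
        print(f"t = {t * 1000:5.2f} ms   DC = {V_DC:6.1f} V   AC = {v_ac:7.1f} V")

The DC column never changes sign, while the AC column swings positive and negative within a single 20 ms cycle.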

The way the electricity is produced differs too, because of the way the generators are built. AC electricity can be generated relatively easily compared with DC. An AC generator consists of a magnetic field rotating past fixed coils; as the magnet turns and sweeps past the wire coils, an AC voltage is induced in them.
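As a rough illustration of that principle, the sketch below applies Faraday’s law to a coil rotating in a magnetic field, where the peak voltage is N·B·A·ω (number of turns, flux density, coil area and rotation speed). The numbers are made-up example values, not figures for a real machine.

    import math

    # Example values chosen for illustration: a 100-turn coil of 0.01 m^2 area
    # in a 0.5 T field, spun at 3000 rpm (50 Hz for a simple two-pole machine).
    # Faraday's law gives e(t) = N * B * A * w * sin(w * t).
    N_TURNS = 100
    B_FIELD = 0.5          # tesla
    AREA = 0.01            # square metres
    RPM = 3000

    omega = 2 * math.pi * RPM / 60            # angular speed, rad/s
    e_peak = N_TURNS * B_FIELD * AREA * omega
    print(f"Frequency: {RPM / 60:.0f} Hz")
    print(f"Peak EMF:  {e_peak:.1f} V")
    print(f"RMS EMF:   {e_peak / math.sqrt(2):.1f} V")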

While DC generators also work on the principle of electromagnetic induction, they are put together quite differently. In a DC generator, the coil of wire is mounted on the shaft, and electrical connections are made to this spinning coil via stationary carbon “brushes” pressing on copper segments on the rotating shaft. All of this is necessary to switch the coil’s alternating output so that the external circuit always sees a constant polarity.
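A toy model of that commutation, again using made-up figures: the spinning coil still generates a sine wave internally, but flipping the connection every half turn means the brushes only ever deliver one polarity.

    import math

    # Toy model with illustrative numbers: the coil's own EMF is a sine wave,
    # and the commutator flips the external connection every half turn, so the
    # brushes deliver a rectified (single-polarity, pulsating) output.
    E_PEAK = 100.0     # volts, illustrative
    FREQ = 50.0        # rotations per second

    for i in range(9):
        t = i * 0.0025
        coil_emf = E_PEAK * math.sin(2 * math.pi * FREQ * t)   # alternating inside the machine
        brush_emf = abs(coil_emf)                              # constant polarity outside it
        print(f"t = {t * 1000:5.2f} ms   coil = {coil_emf:7.1f} V   brushes = {brush_emf:6.1f} V")

Real machines use many coil segments to smooth this pulsating output, but the idea of mechanically switching the polarity is the same.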

In practice AC is much easier to produce, and it has become the more common way of generating electricity today. This also ties back to the history above: at the low voltages then in use, DC could not travel long distances without heavy power losses, whereas AC could be stepped up by transformers to high voltages for efficient transmission. This is why we use AC in our houses and factories.
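The reason high voltage matters can be seen with a back-of-the-envelope I²R calculation; the 1 MW load and 0.5 Ω line resistance below are assumed figures chosen purely for illustration.

    # Assumed figures for illustration: delivering 1 MW over a line with 0.5
    # ohms of total resistance. Raising the transmission voltage reduces the
    # current for the same power, and the I^2 * R loss falls with its square.
    POWER = 1_000_000.0        # watts to deliver
    LINE_RESISTANCE = 0.5      # ohms, illustrative

    for voltage in (1_000.0, 10_000.0, 100_000.0):
        current = POWER / voltage                  # I = P / V
        loss = current ** 2 * LINE_RESISTANCE      # P_loss = I^2 * R
        print(f"{voltage:9.0f} V -> {current:7.1f} A, "
              f"loss {loss / 1000:7.1f} kW ({100 * loss / POWER:.3f}% of the power sent)")

Because transformers made those higher voltages easy to reach with AC, this simple calculation is essentially the argument that settled the war of the currents.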

DC can be transported over large distances, but the cost of the infrastructure and the size of the cables make it too expensive to be the common way of getting power to those who need it. Even today, though, there are many industries that still use DC, such as steel foundries. This is due to the type of application and the prohibitive cost of changing all the equipment to AC.

Another key difference between AC and DC is that DC is used in low-voltage applications such as charging batteries. An AC supply cannot be used directly for this, because the current completes 50 or 60 full cycles every second (the mains frequency is 50 or 60 Hz), reversing polarity twice in each cycle. The AC sine wave has a positive half cycle and a negative half cycle, and over a complete cycle the positive half is canceled out by the negative half, so the voltage and current average out at zero and no net charge can be stored.
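That cancellation is easy to check numerically; the sketch below averages a 50 Hz sine wave over a full cycle and over just its positive half, using an illustrative sampling resolution.

    import math

    # Numerical check: the mean of a sine wave over one complete cycle is
    # (essentially) zero, while the mean over its positive half cycle is not.
    FREQ = 50.0
    SAMPLES = 10_000
    period = 1.0 / FREQ

    full = [math.sin(2 * math.pi * FREQ * (i / SAMPLES) * period) for i in range(SAMPLES)]
    half = full[: SAMPLES // 2]

    print(f"Mean over a full cycle:      {sum(full) / len(full):+.6f}")
    print(f"Mean over the positive half: {sum(half) / len(half):+.6f}")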

From looking into the two different types of electrical supply, you can see they both have their advantages and disadvantages in the modern world.
