Ohm's law
Based on Wikipedia: Ohm's law
The Professor Who Was Called Unworthy to Teach Science
In 1827, a German physics teacher named Georg Ohm published a book with a straightforward claim: the electric current flowing through a wire is directly proportional to the voltage pushing it. Double the voltage, double the current. Halve the voltage, halve the current. Simple.
His colleagues called it "a web of naked fancies."
The German Minister of Education declared that "a professor who preached such heresies was unworthy to teach science." Ohm's career stalled. His work was ridiculed. The prevailing philosophy in German academia at the time held that nature was so perfectly ordered that scientists didn't actually need to perform experiments—truth could be discovered through pure reasoning alone. Actually measuring things? How vulgar.
It took nearly two decades for the scientific establishment to accept what we now consider almost embarrassingly obvious. Today, Ohm's Law is typically the first equation anyone learns about electricity. It's so fundamental that we named the unit of electrical resistance after Ohm himself. But his story reminds us that even the simplest truths can face fierce resistance—no pun intended—when they challenge how people think the world should work.
The Equation That Runs the Modern World
Let's talk about what Ohm actually discovered, because it's more elegant than you might expect.
Imagine you're pushing water through a pipe. The harder you push (more pressure), the more water flows. But if the pipe is narrow or clogged, less water gets through even with the same pressure. Electrical circuits work the same way.
Voltage is the electrical pressure—it's the force pushing electrons through a wire. Current is the flow of those electrons, measured in amperes (often shortened to amps). Resistance is anything that impedes that flow, measured in ohms.
Ohm's Law ties these three quantities together in the simplest possible relationship: voltage equals current multiplied by resistance. Written mathematically, that's V = IR. If you know any two of these values, you can calculate the third. Want to know the current? Divide voltage by resistance. Need to find resistance? Divide voltage by current.
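To make that bookkeeping concrete, here is a minimal Python sketch of the three rearrangements; the helper function and its interface are illustrative, not any standard library:

```python
def ohms_law(voltage=None, current=None, resistance=None):
    """Given any two of V (volts), I (amperes), R (ohms), return the third.

    Illustrative helper: V = I * R, so I = V / R and R = V / I.
    """
    if voltage is None:
        return current * resistance   # V = I * R
    if current is None:
        return voltage / resistance   # I = V / R
    if resistance is None:
        return voltage / current      # R = V / I
    raise ValueError("Leave exactly one quantity unspecified")

# Example: a 9 V battery across a 450-ohm resistor drives 0.02 A (20 mA).
print(ohms_law(voltage=9.0, resistance=450.0))  # -> 0.02
```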
This equation underlies virtually every electronic device you've ever used. When engineers design a smartphone, they're constantly applying Ohm's Law to ensure the right amount of current reaches each component. Too much current and components fry. Too little and they won't function. The battery voltage is fixed, so designers carefully choose resistors to control how current flows through the circuit.
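As a worked illustration of that design step (with made-up but typical component values), suppose a 3.7-volt battery drives an LED that drops about 2.0 volts and should carry roughly 20 milliamps:

```python
# Hypothetical resistor-sizing sketch; all component values are illustrative.
battery_v = 3.7    # volts, e.g. a small lithium cell
led_drop_v = 2.0   # volts taken up by the LED itself
target_i = 0.020   # amperes (20 mA), a safe operating current

# The resistor must absorb the leftover voltage at the target current:
r = (battery_v - led_drop_v) / target_i   # R = V / I
print(f"Choose a resistor of about {r:.0f} ohms")  # -> about 85 ohms
```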
Measuring Current with Your Own Body
Ohm wasn't actually the first person to notice this relationship. Nearly half a century earlier, in 1781, a wealthy English scientist named Henry Cavendish conducted his own experiments with electricity. His laboratory equipment? Glass tubes filled with salt water. His measuring instrument? Himself.
Cavendish would complete electrical circuits by grabbing the ends with his bare hands and noting how strong a shock he felt. A stronger shock meant more current was flowing. Through this decidedly uncomfortable method, he observed that current varied directly with voltage—exactly what Ohm would later describe with mathematical precision.
But Cavendish was notoriously secretive. He never published his results. He barely spoke to anyone. His discoveries sat forgotten in his private notes until James Clerk Maxwell unearthed them nearly a century later, in 1879. By then, Ohm had already received credit for the law that Cavendish had observed first.
Science is full of such near-misses. If Cavendish had simply written a letter to a colleague or presented his findings at a scientific society meeting, we might be talking about Cavendish's Law today.
How Ohm Actually Did It
Ohm's experimental setup was considerably more sophisticated than Cavendish's self-administered shocks.
He initially used voltaic piles—early batteries made of stacked metal discs separated by cardboard soaked in saltwater. But these proved unreliable. The voltage would drift as the chemicals inside depleted. So Ohm switched to a thermocouple, a device that generates voltage from temperature differences. By keeping the temperature difference constant, he could maintain a stable, predictable voltage source.
To measure current, Ohm used a galvanometer—essentially a magnetic compass that deflects in proportion to the current flowing through a wire wrapped around it. The more current, the greater the deflection. This gave him quantitative measurements rather than subjective sensations.
Then came the clever part. Ohm would add test wires of varying lengths, thicknesses, and materials to his circuit. Each change affected how much current flowed. By systematically varying these parameters and recording the galvanometer readings, he built up enough data to spot the underlying pattern.
What he found was that current depended on the total resistance in the circuit—both the internal resistance of his thermocouple and the resistance of whatever test wire he added. Longer wires meant more resistance. Thicker wires meant less. Different materials had different inherent resistivities. Once he accounted for all these factors, the relationship between voltage and current emerged clearly: they were directly proportional, with resistance as the constant of proportionality.
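The pattern Ohm uncovered is summarized today as R = ρL/A, where ρ is the material's resistivity, L the wire's length, and A its cross-sectional area. A quick sketch, using a commonly quoted room-temperature resistivity for copper, shows how length and thickness trade off:

```python
import math

def wire_resistance(resistivity, length_m, diameter_m):
    """R = rho * L / A for a round wire; standard formula, illustrative helper."""
    area = math.pi * (diameter_m / 2) ** 2
    return resistivity * length_m / area

RHO_COPPER = 1.68e-8  # ohm-meters, a typical room-temperature value

# Doubling the length doubles R; doubling the diameter quarters it.
print(wire_resistance(RHO_COPPER, 10.0, 0.001))  # 10 m of 1 mm wire: ~0.21 ohm
print(wire_resistance(RHO_COPPER, 20.0, 0.001))  # ~0.43 ohm
print(wire_resistance(RHO_COPPER, 10.0, 0.002))  # ~0.053 ohm
```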
Why Electrons Actually Obey This Law
For about seventy years after Ohm published his work, nobody knew why his law worked. They knew it was accurate—telegraph engineers relied on it to design communication systems spanning continents—but the underlying mechanism remained mysterious.
Then, in 1897, a British physicist named Joseph John Thomson discovered the electron. Within three years, a German physicist named Paul Drude proposed a model to explain electrical conduction.
Picture a solid metal as a lattice of atoms arranged in a regular pattern, like oranges stacked at a grocery store. These atoms are locked in place. But the electrons? They're free to roam, bouncing around like pinballs in a machine.
When you apply a voltage across the metal, you create an electric field that pushes on the electrons. They start drifting in one direction—this drift is what we call electric current. But the electrons don't travel in straight lines. They constantly collide with the lattice atoms, scattering in random directions.
Here's the key insight: even though individual electrons follow chaotic, zigzagging paths, their average drift velocity ends up proportional to the applied electric field. More voltage means a stronger field, which means faster average drift, which means more current. The relationship is linear—exactly what Ohm's Law predicts.
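That statement can be turned into a compact formula. In the Drude picture, an electron of charge magnitude e and mass m accelerates in the field E for an average time τ between collisions, giving the standard textbook result

$$v_d = \frac{eE\tau}{m}, \qquad J = n e v_d = \frac{n e^2 \tau}{m}\,E \equiv \sigma E,$$

where n is the number of free electrons per unit volume and J is the current density. The conductivity σ = ne²τ/m is a constant for a given material at a given temperature, so current is proportional to field: Ohm's Law, derived from pinball physics.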
Those collisions with atoms do something else too: they convert the electrons' kinetic energy into heat. This is why wires get warm when current flows through them, and why high-resistance elements like the filament in an incandescent light bulb glow white-hot.
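The rate of that heating follows directly from Ohm's Law: the power dissipated in a resistor is P = IV, and substituting V = IR gives P = I²R (equivalently, V²/R). To take a made-up but plausible example, half an ampere flowing through a 200-ohm filament dissipates (0.5)² × 200 = 50 watts as heat and light.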
The Quantum Refinements
Drude's model was remarkably successful, but it had problems. Some of its predictions about heat capacity and other properties were simply wrong. The model treated electrons like classical particles—tiny billiard balls obeying Newton's laws. But electrons, it turned out, are quantum objects with wave-like properties.
In 1927, Arnold Sommerfeld applied quantum mechanics to the problem. He replaced Drude's classical statistics with the quantum Fermi-Dirac distribution, which accounts for the fact that electrons are fermions—particles that obey the Pauli exclusion principle and can't occupy the same quantum state simultaneously. This "free electron model" fixed many of Drude's errors while preserving the successful prediction of Ohm's Law.
A year later, Felix Bloch made another crucial discovery. Electrons in a crystal don't actually bounce off the regularly-arranged lattice atoms the way Drude imagined. Instead, electrons move through the lattice as waves, and a perfect crystal lattice doesn't scatter them at all. What does scatter electrons are impurities and defects—places where the regular pattern is disrupted.
This explained something puzzling: why ultra-pure metals at low temperatures have incredibly low resistance. Remove the impurities and cool everything down (reducing thermal vibrations that also disrupt the lattice), and electrons flow with almost no impediment.
The modern quantum band theory of solids added another layer of understanding. Electrons in a material can't have just any energy; they're restricted to specific energy bands, with forbidden gaps between them. The arrangement of these bands determines whether a material is a conductor, a semiconductor, or an insulator. In conductors, the highest occupied band is only partially filled, so electrons can easily move into nearby empty states. In insulators, the filled bands are separated from the next empty band by a gap too large for electrons to jump across. Semiconductors sit in between, with a small gap, which is why they're so useful for building transistors and computer chips.
Despite all these quantum refinements, Ohm's Law survives. The fundamental linear relationship between voltage and current emerges from the averaged behavior of countless electrons, whether you describe them classically or quantum mechanically.
When Ohm's Law Breaks Down
No physical law is truly universal. Ohm's Law works beautifully for most materials under ordinary conditions, but there are important exceptions.
Some materials are "non-ohmic"—their resistance changes with the applied voltage. Light bulbs, for instance. When a tungsten filament is cold, it has relatively low resistance. As current heats it up, the resistance increases dramatically. The relationship between voltage and current isn't linear; it curves.
Diodes are even more dramatic. These semiconductor devices allow current to flow easily in one direction but block it almost completely in the other. Their current-voltage relationship looks nothing like a straight line—it's exponential in one direction and nearly flat in the other. This non-ohmic behavior is precisely what makes diodes useful for converting alternating current to direct current.
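That exponential curve is usually modeled with the Shockley ideal-diode equation, I = I_S(e^(V/(nV_T)) − 1). A short sketch, with illustrative saturation-current and ideality-factor values:

```python
import math

def diode_current(v, i_s=1e-12, n=1.0, v_t=0.02585):
    """Shockley ideal-diode equation: I = I_S * (exp(V / (n * V_T)) - 1).

    i_s (saturation current, amperes) and n (ideality factor) are
    illustrative values; V_T ~ 25.85 mV is the thermal voltage near 300 K.
    """
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

for v in (-0.5, 0.0, 0.3, 0.6, 0.7):
    print(f"V = {v:+.1f} V -> I = {diode_current(v):.3e} A")
# Reverse bias: essentially -I_S; forward bias: current grows exponentially.
```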
Any material will eventually break down under a strong enough electric field. The electrons get ripped away from their atoms, the material becomes ionized, and suddenly you have a plasma conducting electricity through what was previously an insulator. This is what happens in a lightning bolt: air, normally an excellent insulator, becomes briefly conductive when the voltage between cloud and ground grows large enough.
There were also theoretical concerns about whether Ohm's Law would hold at very small scales. Early twentieth-century physicists wondered if the law might fail at atomic dimensions. But experiments proved otherwise: in 2012, researchers demonstrated that Ohm's Law works for silicon wires just four atoms wide and one atom high. The linear relationship persists down to scales that would have astonished Ohm himself.
The Noise in the Signal
In the 1920s, scientists discovered something curious. Even when you hold voltage and resistance perfectly constant, the current through a resistor fluctuates slightly. These tiny random variations have nothing to do with measurement error—they're fundamental.
This phenomenon, now called Johnson-Nyquist noise, arises from the thermal motion of electrons. At any temperature above absolute zero, atoms vibrate randomly, and so do the free electrons drifting through a conductor. These random motions add a small amount of noise to whatever current you're trying to measure.
The effect is minuscule in everyday circuits. But in sensitive electronics—radio receivers, scientific instruments, the sensors in your smartphone camera—thermal noise sets a fundamental limit on how weak a signal can be detected. Engineers designing such systems must account for it.
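The size of the effect is given by the standard Johnson-Nyquist formula: the root-mean-square noise voltage is sqrt(4 k_B T R Δf), where k_B is Boltzmann's constant, T the temperature, R the resistance, and Δf the measurement bandwidth. A quick estimate with illustrative values:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_vrms(resistance_ohm, temp_k, bandwidth_hz):
    """Johnson-Nyquist RMS noise voltage: sqrt(4 * k_B * T * R * bandwidth)."""
    return math.sqrt(4.0 * K_B * temp_k * resistance_ohm * bandwidth_hz)

# A 1 kilo-ohm resistor at room temperature over a 10 kHz audio bandwidth:
v = thermal_noise_vrms(1e3, 300.0, 10e3)
print(f"{v * 1e6:.2f} microvolts RMS")  # ~0.41 microvolts
```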
Importantly, Ohm's Law still holds for the average current. If you measure over a long enough time period, the random fluctuations average out and you get exactly the ratio that Ohm predicted. The noise is real, but it's noise around the true value, not evidence that the law is wrong.
The Water Pipe Analogy
If you've ever struggled to understand electrical circuits, you're not alone. Electricity is invisible, and our intuitions about it are weak. That's why teachers often use the hydraulic analogy, comparing electrical circuits to water flowing through pipes.
Voltage corresponds to water pressure. A pressure difference between two points in a pipe system causes water to flow, just as a voltage difference causes current to flow. The greater the pressure difference, the faster the water moves.
Current corresponds to the flow rate—how many liters per second pass through a given point. In electrical terms, current is how many coulombs of charge pass per second. One ampere equals one coulomb per second.
Resistance corresponds to anything that restricts flow: narrow pipes, valves, filters. A constriction in a pipe requires more pressure to push the same amount of water through. Similarly, higher electrical resistance requires more voltage to push the same current.
The analogy isn't perfect—water and electrons behave differently in important ways—but it captures the essence of Ohm's Law. The flow rate is proportional to the pressure difference and inversely proportional to the restriction. Double the pressure, double the flow. Double the restriction, halve the flow.
This hydraulic parallel has practical applications beyond teaching. Engineers studying blood flow through the circulatory system sometimes model it as an electrical circuit, using Ohm's Law and its hydraulic equivalent to predict how blood pressure and flow rate relate to the resistance of blood vessels. When an artery narrows due to plaque buildup, the resistance increases, and the heart must pump harder (generate more pressure) to maintain adequate flow.
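In that hemodynamic setting the analogy is written R = ΔP/Q: vascular resistance equals the pressure drop divided by the flow rate. A sketch with rough illustrative numbers (not clinical reference values):

```python
# Hemodynamic analogue of Ohm's law: resistance = pressure drop / flow.
# Values below are illustrative, not clinical reference numbers.
mean_arterial_p = 93.0   # mmHg, rough mean arterial pressure
venous_p = 4.0           # mmHg, rough central venous pressure
cardiac_output = 5.0     # liters per minute

resistance = (mean_arterial_p - venous_p) / cardiac_output
print(f"{resistance:.1f} mmHg·min/L")  # ~17.8; narrowed vessels raise this,
# forcing the heart to generate more pressure to maintain the same flow.
```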
The Generalized Forms
The simple V = IR equation works for basic circuits, but electricity in the real world is more complex. There's a more general form of Ohm's Law that applies to materials rather than circuits, and it was formulated by another German physicist, Gustav Kirchhoff.
Instead of thinking about total voltage across and current through a component, this version considers the electric field at each point inside a material and the current density—how much current flows per unit area. The relationship is similar: current density equals the electric field multiplied by the material's conductivity.
Conductivity is the inverse of resistivity. A highly conductive material like copper has high conductivity and low resistivity. A poor conductor like rubber has low conductivity and high resistivity. This formulation lets physicists and engineers analyze how current distributes itself inside complex three-dimensional structures, not just along simple wires.
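In symbols, this continuum form reads

$$\mathbf{J} = \sigma \mathbf{E}, \qquad \sigma = \frac{1}{\rho},$$

where J is the current density (amperes per square meter), E the electric field, σ the conductivity, and ρ the resistivity. The familiar circuit version V = IR follows by integrating this relation along a uniform wire.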
For oscillating currents—alternating current, or AC, as opposed to the direct current, DC, that flows steadily in one direction—things get more complicated still. Resistance alone doesn't tell the whole story. Capacitors and inductors introduce frequency-dependent effects that Ohm never considered. But these developments don't contradict Ohm's Law; they extend it into domains he couldn't have imagined.
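In the AC extension, resistance generalizes to a complex impedance Z, and Ohm's Law keeps its shape as V = IZ. A brief sketch using the standard component impedances (Z_R = R, Z_L = jωL, Z_C = 1/(jωC)); the circuit values here are illustrative:

```python
import cmath, math

def series_rlc_impedance(r, l, c, freq_hz):
    """Impedance of a series RLC circuit: Z = R + j*w*L + 1/(j*w*C)."""
    w = 2 * math.pi * freq_hz
    return r + 1j * w * l + 1 / (1j * w * c)

# Illustrative values: 100 ohm, 10 mH, 1 uF, driven at 1 kHz.
z = series_rlc_impedance(100.0, 10e-3, 1e-6, 1000.0)
v = 5.0       # volts (AC amplitude)
i = v / z     # Ohm's law with complex impedance: I = V / Z
print(f"|Z| = {abs(z):.1f} ohm, |I| = {abs(i)*1e3:.1f} mA, "
      f"phase = {math.degrees(cmath.phase(i)):.1f} deg")
```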
A Unit Named Siemens
There's one more piece of this story worth mentioning. For decades, electrical conductance—the inverse of resistance, measuring how easily current flows—was measured in units called mhos. Mho is just "ohm" spelled backwards, a bit of scientific wordplay.
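In symbols, conductance is simply G = 1/R, so Ohm's Law can be written I = GV just as naturally as V = IR; a 50-ohm resistor, for example, has a conductance of 0.02 mhos.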
In 1971, the International System of Units adopted a new name for this unit: the siemens, honoring Ernst Werner von Siemens, the German industrialist who founded the company that still bears his name. Siemens built telegraph networks across Europe and pioneered electrical engineering as a practical discipline.
The siemens is now the preferred unit in formal scientific papers, though mho persists in some engineering contexts. It's fitting that both the unit of resistance and the unit of conductance are named after German electrical pioneers—Ohm for resistance, Siemens for conductance. Two sides of the same coin, measured in units commemorating two men who helped electrify the world.
From Heresy to Orthodoxy
By the 1850s, just two decades after being called a web of naked fancies, Ohm's Law was considered proven fact. Alternatives like "Barlow's Law" were discredited when put to practical tests. Samuel Morse, inventor of the telegraph code, discussed this explicitly in 1855—Ohm's Law worked for designing telegraph systems; competing theories didn't.
Ohm himself lived to see his vindication. Before his death in 1854, he received recognition for his contributions to science, including a medal from the Royal Society in London. The same German educational establishment that had declared him unworthy eventually came around.
Today, V = IR appears in textbooks worldwide. It's taught to middle schoolers. It's fundamental to every electrical engineering curriculum. It governs the design of the computer or phone you're using to read this, the power grid delivering electricity to your home, the neural signals in your brain processing these words.
Not bad for a web of naked fancies.