Wikipedia Deep Dive

Hick's law

Based on Wikipedia: Hick's law

Every time you stare at a restaurant menu with forty-seven options and feel your brain slowly grinding to a halt, you're experiencing one of the most elegant findings in experimental psychology. It turns out your mental paralysis isn't random frustration—it follows a precise mathematical pattern that two researchers discovered in the early 1950s.

The Discovery That Changed How We Think About Thinking

In 1952, a British psychologist named William Edmund Hick published a deceptively simple experiment. He arranged ten lamps in a circle around his test subjects and gave them ten telegraph keys, one for each finger. When a lamp lit up, subjects pressed the corresponding key as fast as they could. A running tape of pre-punched holes determined which lamp would light next, creating a random sequence that kept participants guessing.

What Hick found was surprising.

When he changed the number of possible lamps that might light up—sometimes two, sometimes four, sometimes all ten—the reaction times didn't increase in the way you might expect. If you doubled the number of choices, the time to respond didn't double. Instead, it increased by a fixed amount. Triple the choices? Add that same fixed amount again. The relationship was logarithmic, not linear.

Around the same time, an American psychologist named Ray Hyman was running similar experiments at Johns Hopkins University. Hyman's setup was different—he used eight lights in a grid and asked subjects to say the name of whichever light turned on. But his results confirmed what Hick had found. The relationship between the number of choices and reaction time followed a consistent mathematical pattern.

Together, their work became known as Hick's law, or sometimes the Hick-Hyman law. And its implications reach far beyond psychology laboratories.

Why Logarithms Matter Here

To understand why this finding is so interesting, you need to appreciate what a logarithmic relationship actually means in practical terms.

Imagine you're trying to guess a number between one and a hundred. If you approach this linearly—checking each number one by one—you might need up to a hundred guesses. But if you use a smarter strategy, you can do it in seven guesses or fewer. You start by asking: is it above or below fifty? Then you halve the remaining possibilities again. And again. Each question eliminates half of what's left.

This is binary search, and it's exactly how our brains seem to approach choices, at least according to Hick's law. We don't evaluate options one at a time. We categorize. We subdivide. We eliminate whole branches of possibilities with each mental step.
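To make the strategy concrete, here is a minimal sketch of that guessing game in Python; the range of one to a hundred simply mirrors the example above.

```python
def guess_number(target, low=1, high=100):
    """Binary search for a hidden number: each question halves what's left."""
    guesses = 0
    while low <= high:
        mid = (low + high) // 2
        guesses += 1
        if mid == target:
            return guesses
        elif mid < target:
            low = mid + 1
        else:
            high = mid - 1

# Even the hardest-to-find number needs only seven questions out of a hundred.
print(max(guess_number(n) for n in range(1, 101)))  # -> 7
```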

The mathematical formula Hick derived expresses this beautifully. The time to choose among n equally probable options is proportional to log₂(n + 1), the base-two logarithm of the number of options plus one. That "plus one" accounts for something subtle: even before you choose which option, you have to decide whether to respond at all. There's always an implicit choice hiding beneath the explicit ones.
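Written out as code, the formula looks something like the sketch below; the constant b, the per-person processing rate in seconds per bit, is a hypothetical value chosen only for illustration.

```python
import math

def hick_reaction_time(n_choices, b=0.2):
    """Hick's law for n equally probable options: T = b * log2(n + 1).
    b is a hypothetical per-person rate in seconds per bit."""
    return b * math.log2(n_choices + 1)

print(round(hick_reaction_time(2), 2))   # ~0.32 s
print(round(hick_reaction_time(4), 2))   # ~0.46 s
print(round(hick_reaction_time(10), 2))  # ~0.69 s
```

Notice that going from two options to ten, a fivefold increase, adds well under half a second under these illustrative numbers: the cost grows with the logarithm, not the count.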

The Deeper Pattern

What happens when the choices aren't equally likely? This is where the law gets more nuanced and more interesting.

Hyman showed that the crucial factor isn't the raw number of choices but rather the information content of the decision—a concept borrowed directly from information theory, the mathematical framework developed by Claude Shannon at Bell Labs. When some options are more probable than others, the effective number of choices decreases. If you know that one light comes on ninety percent of the time, having nine other possibilities barely affects your reaction speed.

This generalization uses entropy, a measure of uncertainty. High entropy means high uncertainty—all options genuinely seem possible. Low entropy means the outcome is mostly predictable. Your reaction time tracks with entropy, not with the simple count of options.
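Here is a rough sketch of that entropy calculation, using the generalization in which each option contributes its probability times log₂(1/p + 1); the probability values below are invented to echo the ninety-percent example above.

```python
import math

def decision_entropy(probabilities):
    """Hick-Hyman information content for unequally likely options:
    H = sum of p * log2(1/p + 1) over all options."""
    return sum(p * math.log2(1 / p + 1) for p in probabilities if p > 0)

# Ten equally likely lights vs. one light that comes on ninety percent of the time.
uniform = [0.1] * 10
skewed = [0.9] + [0.1 / 9] * 9

print(round(decision_entropy(uniform), 2))  # ~3.46 bits
print(round(decision_entropy(skewed), 2))   # ~1.62 bits: mostly predictable
```

The skewed case carries less than half the information of the uniform one, even though both have ten lights, which is exactly why the extra nine possibilities barely slow you down.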

Think about typing on a keyboard. In theory, you have twenty-six letter choices plus punctuation. But after typing "qu," the number of realistic options collapses. In English, "u" follows "q" almost universally. Your brain exploits these patterns constantly, which is why experienced typists can achieve speeds that would be impossible if they truly considered every key equally.

When the Law Breaks Down

No scientific law is universal, and Hick's law has fascinating exceptions that reveal as much as the rule itself.

When researchers studied verbal responses to familiar stimuli—words that subjects knew intimately—the logarithmic relationship weakened or disappeared entirely. Recognizing your own name in a list of twenty names takes about the same time as recognizing it in a list of five. Familiarity creates shortcuts that bypass the normal decision architecture.

Eye movements present another exception. Saccades—the rapid jumps your eyes make when scanning a scene—don't follow Hick's law at all. In some studies, having more possible targets actually decreased the time to initiate an eye movement, an antagonistic effect that inverts the expected pattern. The visual system apparently operates under different rules than conscious choice.

There's also the question of structured versus random sequences. If choices appear in predictable patterns, people learn those patterns and react faster than Hick's law would predict. Recent research suggests the relationship between predictability and reaction time isn't even linear with entropy—it's sigmoid, following an S-curve with different modes of processing at different predictability levels.

The Intelligence Connection

In 1964, a researcher named E. Roth discovered something provocative: the slope of someone's Hick's law curve correlates with their scores on intelligence tests.

Here's what that means. Everyone's reaction time increases as choices multiply, but some people's times increase more steeply than others. The steepness of that increase—how much each additional choice slows you down—relates inversely to measured IQ. People with higher scores show flatter slopes; they're less affected by additional options.
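To see what "slope" means in practice, here is a small illustrative fit; the reaction times are invented for the sketch, not data from Roth's study.

```python
import math

# Hypothetical reaction times in seconds for 2, 4, and 8 equally likely choices.
choices = [2, 4, 8]
times = [0.45, 0.58, 0.72]

# Fit T = a + b * log2(n + 1) by least squares; b is the Hick slope.
xs = [math.log2(n + 1) for n in choices]
mean_x = sum(xs) / len(xs)
mean_t = sum(times) / len(times)
b = sum((x - mean_x) * (t - mean_t) for x, t in zip(xs, times)) / sum(
    (x - mean_x) ** 2 for x in xs
)
a = mean_t - b * mean_x
print(f"intercept a = {a:.3f} s, slope b = {b:.3f} s per bit")
```

The slope b, seconds added per bit of uncertainty, is the quantity Roth found to vary inversely with test scores; the intercept a captures everything that doesn't depend on the number of choices.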

This finding sparked decades of research into the relationship between information processing speed and intelligence. It's still debated whether the correlation reflects something fundamental about cognitive capacity or merely captures shared variance with other factors. But the connection exists, and it suggests that Hick's law taps into something basic about how minds work.

Design Implications

If you've ever used a well-designed piece of software, you've benefited from Hick's law, whether the designers knew the term or not.

Consider a menu with a hundred options. Presented as a flat list, it forces you to scan linearly, looking at each option until you find yours. That's not what Hick's law describes; it's worse. The law applies to choice among known alternatives, not search through unknown ones.

But reorganize that menu into categories, and something changes. Alphabetize it, or group items by function, and users can employ the subdividing strategy that Hick discovered. Is your target in the first half or second half? First quarter or second? The hierarchy enables logarithmic search, collapsing a hundred-option problem into something manageable.

This explains why dropdown menus with a dozen well-organized items often feel faster than menus with eight randomly arranged ones. It's not just about the count—it's about the structure. Hick's law is really a law about how we navigate hierarchies of choice.
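A back-of-the-envelope comparison makes the point; the per-item scanning time and the Hick constant below are hypothetical, chosen only to show the shape of the difference between a flat scan and a categorized menu.

```python
import math

SCAN_PER_ITEM = 0.3  # hypothetical seconds to read and reject one menu item
B = 0.2              # hypothetical Hick constant, seconds per bit

def linear_scan(n_items):
    """Unordered flat list: on average you inspect about half the items."""
    return SCAN_PER_ITEM * n_items / 2

def two_level_menu(n_categories, items_per_category):
    """Two chained Hick-style choices: pick a category, then an item within it."""
    return B * (math.log2(n_categories + 1) + math.log2(items_per_category + 1))

print(f"flat scan of 100 items:    {linear_scan(100):.1f} s")         # ~15 s
print(f"10 categories of 10 items: {two_level_menu(10, 10):.1f} s")   # ~1.4 s
```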

Stimulus-Response Compatibility

There's another factor that modulates choice reaction time: how naturally the response maps to the stimulus.

Turning a steering wheel to turn car wheels is highly compatible. The action mirrors the effect. Pressing a button to toggle a distant light is less compatible—there's no obvious physical relationship between what you do and what happens.

When stimulus and response are compatible, Hick's law effects shrink. The logarithmic relationship still holds, but the whole curve shifts downward. Incompatible mappings inflate reaction times across the board and may steepen the slope.

This matters for interface design. Controllers that feel intuitive—where the mapping between action and outcome seems natural—reduce cognitive load. Users can respond faster because they don't have to translate between incompatible representations. Video game designers have known this intuitively for decades. The research confirms why some control schemes feel effortless while others remain frustrating no matter how much you practice.

The Paradox of Choice

Hick's law intersects with another concept that has entered popular consciousness: the paradox of choice, popularized by psychologist Barry Schwartz. He argues that more options don't just slow our decisions; they can also leave us less satisfied with whatever we finally choose.

The mechanisms are different. Hick's law describes reaction time—a measurable quantity in milliseconds. The paradox of choice addresses satisfaction, regret, and the psychological burden of imagining paths not taken. But they point toward the same practical conclusion: proliferating options has costs that aren't obvious.

A restaurant with three hundred menu items isn't three hundred times better than one with thirty. It might actually be worse. Not just because scanning takes longer—that's the linear search problem—but because the decision process itself becomes exhausting. Each additional option is another branch on the tree you have to mentally prune.

Fitts's Law: The Physical Counterpart

Hick's law has a famous sibling: Fitts's law, which describes the time required to move to a target.

Where Hick's law governs cognitive choice, Fitts's law governs physical action. The time to click on a button depends on how far away it is and how small it is. Distant, tiny targets take longer than nearby, large ones. The relationship is, once again, logarithmic.

This isn't coincidence. Both laws describe tasks where performance improves through iterative refinement. In Hick's law, you narrow down choices by successive elimination. In Fitts's law, you narrow down position through successive corrections. The binary search of the mind mirrors the targeting adjustments of the hand.

Together, these laws form the foundation of quantitative models of human-computer interaction. If you want to predict how long it takes someone to find and click a menu item, you need both: Hick for the decision, Fitts for the movement. Multiply enough interface elements by enough users by enough interactions per day, and these milliseconds compound into hours of collective human time saved or wasted.
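A toy estimate might chain the two laws like the sketch below; the constants, target size, and menu length are hypothetical, and the Fitts term uses the common Shannon formulation rather than anything specified in the source.

```python
import math

def hick_time(n_options, b=0.2):
    """Decision time for n equally likely options (Hick): b * log2(n + 1)."""
    return b * math.log2(n_options + 1)

def fitts_time(distance, width, a=0.1, b=0.15):
    """Movement time to a target (Fitts, Shannon formulation):
    MT = a + b * log2(distance / width + 1)."""
    return a + b * math.log2(distance / width + 1)

# Hypothetical task: pick one of 8 menu items, then click a 20-pixel-wide
# target that sits 300 pixels away.
total = hick_time(8) + fitts_time(300, 20)
print(f"predicted time: {total:.2f} s")  # ~1.33 s
```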

From Telegraph Keys to Touch Screens

It's worth pausing to appreciate the journey from Hick's original experiment to modern applications.

He was using telegraph keys, Morse code equipment that would have been familiar to nineteenth-century operators. His recording system ran on 4-bit binary code, punched paper tape, and electric pens scratching marks on moving paper. The subjects sat surrounded by lamps in a circle, tapping out responses like Victorian telegraphers.

Yet the relationship he found persists on devices he couldn't have imagined. When you hesitate over a smartphone app grid, your brain is performing the same subdividing search. When a video game presents you with eight weapons and you're fractionally slower to choose than when you have four, that's the Hick-Hyman law in action.

The stability of this finding across seventy years and revolutionary technological change suggests it captures something fundamental about human cognition. Not a quirk of 1950s experimental apparatus but a basic constraint on how decisions unfold in neural tissue.

What the Law Teaches Us

Hick's law is ultimately a story about limits—specifically, about the limits of conscious attention and the strategies we use to work within them.

We cannot process unlimited options simultaneously. But we don't process them one by one either. We impose structure. We categorize. We eliminate. The logarithmic relationship emerges from this hierarchical processing strategy, repeated over and over until a single option remains.

Understanding this changes how you might approach any situation involving choice. Want to help someone decide faster? Don't just reduce options—organize them. Create categories that enable binary subdivision. Make the structure visible.

Want to understand your own decision paralysis? Recognize that each doubling of options adds the same fixed burden to your processing time. Ten choices aren't ten times harder than one—they're only about three times harder. But a hundred choices are about seven times harder. And a thousand are about ten times harder. The increases slow down, but they never stop.
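Those multiples fall straight out of the logarithm, as this tiny check shows.

```python
import math

# Relative burden of n choices versus a single choice, per Hick's law:
# log2(n + 1) compared with log2(1 + 1) = 1 bit.
for n in (1, 10, 100, 1000):
    print(n, round(math.log2(n + 1), 1))
# 1 -> 1.0, 10 -> 3.5, 100 -> 6.7, 1000 -> 10.0
```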

Perhaps the deepest lesson is about the relationship between information and time. Hick's law says that processing information takes time, and the time it takes is proportional to the amount of information being processed. This sounds almost tautological. But the precision of the relationship—the fact that it's logarithmic, the fact that it's measurable, the fact that it holds across such diverse conditions—transforms a commonsense intuition into a quantitative tool.

Next time you're frozen in front of too many options, remember: your brain is doing exactly what it should, subdividing and eliminating at its characteristic rate. The question isn't whether to eliminate the pause—that's built into the hardware. The question is whether the choice architecture around you helps or hinders the search. Hick and Hyman gave us the formula. What we build with it is up to us.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.