Goodhart’s law

"Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes." – Charles Goodhart

Image created with Midjourney. Prompt: A 2D minimal style illustration showing a large, bustling cityscape. At the heart of the city, a giant, glass and steel metric ruler reaching up towards the sky, casting a long shadow. The citizens, portrayed as tiny figures, are all converging on the ruler, trying to push it higher and higher. Despite their efforts, the buildings around the ruler are starting to warp and bend, showing the distortion effects mentioned in Goodhart's Law. The color palette is vibrant with dominant hues of blue and silver, reflecting a cool, urban atmosphere. The lighting should be twilight, with the setting sun in the background, creating dramatic contrasts. --v 5.1

Goodhart's Law is a concept in economics that states: "When a measure becomes a target, it ceases to be a good measure." The law was named after British economist Charles Goodhart, who articulated this principle. It reflects a fundamental challenge in management and policy-making.

In simpler terms, Goodhart's Law suggests that when we use a specific metric to guide our actions and decisions, people will find ways to manipulate that metric to their advantage, often in ways that undermine its original purpose.

Let's illustrate this with some examples:

Education and Standardized Testing: Suppose a school system decides to focus on improving standardized test scores. Teachers might "teach to the test," focusing solely on the topics that will be evaluated and neglecting broader educational objectives. The result is that while test scores might improve, real understanding and learning might not. The measure (test scores) became a target, and thus ceased to be a good measure of overall educational quality.

Business Sales Goals: Imagine a company sets a sales target for its employees. To reach this target, salespeople might resort to aggressive tactics that could alienate customers in the long run. Again, the measure (sales figures) became a target, and in the process, it stopped being a good measure of the company's customer relationships and long-term profitability.

Healthcare and Patient Treatment Times: In an attempt to improve efficiency, a hospital might set a target for reducing patient treatment times. This could lead to staff rushing patient care to meet the target, potentially compromising the quality of care. In this case, the measure (treatment time) became a target and is no longer a good measure of efficient, high-quality patient care.

Developer Performance and Lines of Code: Suppose a company scores developer performance by the number of lines of code committed. Developers might pad their commits with unnecessary code to raise their score, leading to an unjustifiably bloated codebase. The measure (lines committed) became a target, and stopped being a good measure of actual productivity.
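The lines-of-code example can be sketched in a few lines of Python. This is a toy model, not real data: the task size, the padding amount, and the assumption that quality degrades in proportion to bloat are all illustrative choices made for the sake of the example.

```python
# Toy simulation of Goodhart's Law: rewarding developers per line committed.
# The numbers and the "quality" model below are illustrative assumptions.

def solve_task(pad_lines: int = 0) -> dict:
    """A task genuinely needs 20 lines; padding adds lines but hurts quality."""
    needed = 20
    lines = needed + pad_lines
    # Assumed model: quality falls as the codebase bloats beyond what is needed.
    quality = needed / lines
    return {"lines": lines, "quality": quality}

def score_by_lines(result: dict) -> int:
    """The proxy metric management tracks: raw line count."""
    return result["lines"]

honest = solve_task(pad_lines=0)    # developer just solves the task
gamed = solve_task(pad_lines=80)    # developer optimizes the metric instead

# The proxy metric improves while the real goal degrades:
assert score_by_lines(gamed) > score_by_lines(honest)
assert gamed["quality"] < honest["quality"]
```

Once line count becomes the target, the highest-scoring behavior (padding) is exactly the behavior that undermines what the metric was meant to track.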

In all these examples, the original goal (improving education, increasing profits, improving healthcare, or measuring developer performance) is undermined when the measure becomes the target. This is the essence of Goodhart's Law. It's a reminder that when setting targets and measures, we need to consider potential unintended consequences and stay attentive to the broader goals.

More sources on the topic