Goodhart's Law is a concept in economics that states: "When a measure becomes a target, it ceases to be a good measure." The law was named after British economist Charles Goodhart, who articulated this principle. It reflects a fundamental challenge in management and policy-making.
In simpler terms, Goodhart's Law suggests that when we use a specific metric to guide our actions and decisions, people will find ways to manipulate that metric to their advantage, often in ways that undermine its original purpose.
Examples
Let's illustrate this with some examples:
Education and Standardized Testing: Suppose a school system decides to focus on improving standardized test scores. Teachers might "teach to the test," focusing solely on the topics that will be evaluated and neglecting broader educational objectives. The result is that while test scores might improve, real understanding and learning might not. The measure (test scores) became a target, and thus ceased to be a good measure of overall educational quality.
Business Sales Goals: Imagine a company sets a sales target for its employees. To reach this target, salespeople might resort to aggressive tactics that could alienate customers in the long run. Again, the measure (sales figures) became a target, and in the process, it stopped being a good measure of the company's customer relationships and long-term profitability.
Healthcare and Patient Treatment Times: In an attempt to improve efficiency, a hospital might set a target for reducing patient treatment times. This could lead to staff rushing patient care to meet the target, potentially compromising the quality of care. In this case, the measure (treatment time) became a target and ceased to be a good measure of efficient, high-quality patient care.
Software Development and Lines of Code: A team might score developer performance by the number of lines of code committed. Developers then have an incentive to write verbose, repetitive code rather than concise, maintainable solutions, which usually leads to an unjustifiably bloated codebase. The measure (lines committed) became a target and stopped being a good measure of productivity or code quality.
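To make the software example concrete, here is a toy Python sketch (the metric and function names are hypothetical, purely for illustration) showing how a lines-of-code score rewards the bloated version of two functionally identical functions:

```python
import inspect

def score_by_lines(func):
    """Hypothetical 'performance' metric: count the lines of a function's source."""
    return len(inspect.getsource(func).splitlines())

def total_concise(numbers):
    # Idiomatic one-liner: sums a list of numbers.
    return sum(numbers)

def total_bloated(numbers):
    # Functionally identical, but padded to inflate the line count.
    result = 0
    for index in range(len(numbers)):
        value = numbers[index]
        result = result + value
    return result

if __name__ == "__main__":
    data = [1, 2, 3, 4]
    assert total_concise(data) == total_bloated(data)  # same behavior
    print(score_by_lines(total_concise))  # fewer lines, lower "score"
    print(score_by_lines(total_bloated))  # more lines, higher "score", worse code
```

Once developers know the metric, the rational move is to write like `total_bloated`: the score goes up while the codebase gets harder to maintain, which is exactly the dynamic Goodhart's Law describes.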
In all these examples, the original goal (improving education, increasing profits, delivering good healthcare, or measuring developer productivity) is undermined when the measure becomes the target. This is the essence of Goodhart's Law. It's a reminder that when setting targets and measures, we need to consider potential unintended consequences and stay attentive to the broader goals.