Basic Principles

Goodhart's Law

“When a measure becomes a target, it ceases to be a good measure.”

A famous example of this is what is now called the ‘cobra effect’. The story goes as follows: in India, under British rule, the colonial government was concerned about the number of venomous cobras in Delhi. The government decided to recruit the local populace in its effort to reduce the number of snakes, and started offering a bounty for every dead cobra brought to its door.

Initially, this was a successful strategy: people came in with large numbers of slaughtered snakes. But as time passed, enterprising individuals began breeding cobras solely to kill them later and collect the extra income.

When the British government discovered this, it scrapped the bounty; the breeders released their now-worthless cobras into the wild, and Delhi experienced a boom in hooded snakes.

The Raj’s cobra problem thus ended up no better than before the scheme began.

Why?

Because nearly every measure is an imperfect proxy for the thing you actually care about. Once that proxy becomes a target, optimizing it is likely to pull you away from your true goal.

And because you might pursue the measure by means that are harmful or unfair. The general idea is that an agent may optimize for a metric in a way that defeats the metric’s purpose (the cobra effect), or may optimize for a measure in a way that destroys that measure’s predictive power.
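
To make the second failure mode concrete, here is a minimal Python simulation; the setup and numbers are illustrative assumptions, not data from any real study. Each agent has a true outcome we care about and an observed proxy. Once the proxy becomes a target, agents pour "gaming" effort into the proxy itself, and its correlation with the true outcome drops:

```python
import random

random.seed(0)

def sample(gaming_scale):
    """One agent: the true outcome we care about, and the observed proxy."""
    true_outcome = random.gauss(0, 1)
    noise = random.gauss(0, 0.5)
    # Effort aimed at the metric itself, which adds nothing to the true outcome.
    gaming = abs(random.gauss(0, gaming_scale))
    return true_outcome, true_outcome + noise + gaming

def corr(pairs):
    """Pearson correlation between the two columns of (true, proxy) pairs."""
    xs, ys = zip(*pairs)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

N = 10_000
# Before the measure becomes a target, nobody games it.
before = [sample(gaming_scale=0.0) for _ in range(N)]
# Once it is a target, agents invest effort in the metric itself.
after = [sample(gaming_scale=2.0) for _ in range(N)]

print(f"proxy/goal correlation before targeting: {corr(before):.2f}")
print(f"proxy/goal correlation after targeting:  {corr(after):.2f}")
```

With these toy numbers the correlation drops noticeably once gaming effort enters the proxy: the measure still goes up when agents optimize it, but it tells you much less about the outcome you actually wanted.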

How to avoid the consequences of Goodhart's law

  • Keep the original purpose of the measure in mind.

  • The Principle of Pairing Indicators: combine the measurement of an effect with a measurement of its counter-effect (see the sketch after this list).
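
As a toy illustration of paired indicators, with all names and numbers hypothetical: scoring agents on the effect alone ("bugs closed") rewards hasty work, while pairing it with its counter-effect ("bugs reopened") does not:

```python
def paired_score(bugs_closed: int, bugs_reopened: int, penalty: float = 2.0) -> float:
    """Combine a measure (bugs closed) with its counter-measure (bugs reopened)."""
    return bugs_closed - penalty * bugs_reopened

# A hasty fixer closes many bugs, but most come back;
# a careful fixer closes fewer, and they stay closed.
hasty = paired_score(bugs_closed=50, bugs_reopened=20)   # 50 - 2*20 = 10.0
careful = paired_score(bugs_closed=30, bugs_reopened=2)  # 30 - 2*2  = 26.0
print(f"hasty: {hasty}, careful: {careful}")
```

The weight on the counter-measure is a design choice: it should be large enough that gaming the primary measure is no longer profitable.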
