Benford's Law, also known as the first-digit law, has long been seen as a tantalizing and mysterious law of nature. Attempts to explain it range from the supernatural to the measure-theoretic, and applications range from fraud detection to computer disk space allocation. Publications on the topic have escalated in recent years, largely covering investigation of the law in different data sources, applications in fraud and computer science, and new probability theorems. The underlying reason why Benford's Law occurs is, however, elusive. Many researchers have verified for themselves that the law is widely obeyed, but have also noted that the popular explanations are not completely satisfying. In this article we do nothing rigorous, but provide a simple, intuitive explanation of why and when the law applies. It is intended that the explanation should be accessible to school students and anyone with a basic knowledge of probability density curves and logarithms.
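The law itself is simple to state: in many naturally occurring data sets, the leading digit d appears with probability log10(1 + 1/d), so roughly 30.1% of values begin with a 1 but only about 4.6% begin with a 9. As a minimal illustration (not taken from the article), the Python sketch below checks these proportions against the powers of 2, a classic sequence whose leading digits are known to follow the law:

```python
import math
from collections import Counter

def benford_prob(d):
    """Benford's Law: probability that the leading digit is d (1-9)."""
    return math.log10(1 + 1 / d)

def leading_digit(n):
    """First decimal digit of a positive integer."""
    return int(str(n)[0])

# Empirical check against the first 10,000 powers of 2.
N = 10_000
counts = Counter(leading_digit(2 ** k) for k in range(N))

for d in range(1, 10):
    print(f"digit {d}: observed {counts[d] / N:.4f}, Benford {benford_prob(d):.4f}")
```

The observed frequencies agree with the theoretical proportions to within a fraction of a percent, which is the kind of agreement the article sets out to explain intuitively.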
A simple explanation of Benford's Law, by Rachel Fewster.
The American Statistician, 63, 26–32, 2009.
Full text (PDF).
The American Statistician Website
Here is a talk on Benford's Law, delivered at the Statistics Teachers Day workshop,
University of Auckland, November 2008:
How to Fake Data If You Must.
Can you spot the cheats using Benford's Law? Here's a classroom activity with instructions
and R code.
Catch the Cheats!
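The activity's own instructions and R code are linked above; as a rough idea of how such a check can work, here is a hypothetical Python sketch (not the activity's code) that compares observed leading-digit frequencies against the Benford proportions with a chi-square goodness-of-fit statistic. With 9 digit categories there are 8 degrees of freedom, so a statistic above about 15.51 rejects Benford agreement at the 5% level:

```python
import math
import random
from collections import Counter

# Theoretical Benford proportions for leading digits 1-9.
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(x):
    """First significant decimal digit of a positive number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def chi_square_vs_benford(values):
    """Chi-square goodness-of-fit statistic of the observed leading
    digits against the Benford distribution (8 degrees of freedom)."""
    n = len(values)
    counts = Counter(leading_digit(v) for v in values)
    return sum((counts[d] - n * p) ** 2 / (n * p) for d, p in BENFORD.items())

random.seed(1)
genuine = [2 ** k for k in range(1, 1001)]               # known to obey Benford's Law
faked = [random.randint(100, 999) for _ in range(1000)]  # uniform digits: a naive fake

# A statistic above ~15.51 rejects Benford agreement at the 5% level (df = 8).
print("genuine:", chi_square_vs_benford(genuine))
print("faked:  ", chi_square_vs_benford(faked))
```

The uniformly generated "fake" data set produces a very large statistic, because naive fakers tend to spread digits evenly, while genuinely Benford-distributed data stays comfortably below the critical value.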