A simple explanation of Benford's Law

R. M. Fewster


Benford's Law, also known as the first-digit law, has long been seen as a tantalizing and mysterious law of nature. Attempts to explain it range from the supernatural to the measure-theoretic, and its applications range from fraud detection to computer disk space allocation. Publications on the topic have proliferated in recent years, largely covering investigations of the law in different data sources, applications in fraud detection and computer science, and new probability theorems. The underlying reason why Benford's Law occurs, however, remains elusive. Many researchers have verified for themselves that the law is widely obeyed, but have also noted that the popular explanations are not completely satisfying. In this article we do nothing rigorous, but instead provide a simple, intuitive explanation of why and when the law applies. The explanation is intended to be accessible to school students and to anyone with a basic knowledge of probability density curves and logarithms.
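
For concreteness, the law states that in many naturally occurring datasets the leading digit d (from 1 to 9) occurs with probability log10(1 + 1/d), so a leading 1 appears roughly 30.1% of the time while a leading 9 appears only about 4.6% of the time. The short Python sketch below is an illustration added for this page rather than part of the paper; it checks the prediction against the powers of 2, a classic sequence known to follow the law, and the choice of sequence and the sample size N are arbitrary.

    import math

    def leading_digit(x):
        # The first character of the decimal representation is the leading digit.
        return int(str(x)[0])

    # Count the leading digits of the first N powers of 2.
    N = 10000
    counts = {d: 0 for d in range(1, 10)}
    for n in range(1, N + 1):
        counts[leading_digit(2 ** n)] += 1

    # Compare observed frequencies with Benford's prediction log10(1 + 1/d).
    for d in range(1, 10):
        observed = counts[d] / N
        predicted = math.log10(1 + 1 / d)
        print(f"digit {d}: observed {observed:.4f}  predicted {predicted:.4f}")

Running the sketch shows the observed frequencies settling very close to the predicted logarithmic proportions, with the digit 1 leading far more often than the digit 9.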


Paper

Presentation

Class experiment


Last updated: 16th March 2009