
Maximum likelihood facts for kids


Maximum likelihood estimation (or maximum likelihood) is the name used for a number of ways to guess the parameters of a parametrised statistical model. These methods pick the value of the parameter that makes the observed data most likely under the model. The method was mainly developed by R. A. Fisher in the early 20th century. A related method, in which prior probabilities for the parameters are known beforehand, is called maximum a posteriori estimation.
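The idea can be sketched in a few lines of Python. The coin flips below are made-up data for illustration only; the code tries many candidate values of the parameter (the probability of heads) and keeps the one under which the observed flips are most probable. A simple grid search is used here just to make the "most likely" idea visible, not as a practical method.

```python
import math

# Made-up data: 10 coin flips, 1 = heads, 0 = tails (7 heads in total).
flips = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]

def log_likelihood(p, data):
    """Log-probability of the observed flips if heads has probability p."""
    return sum(math.log(p) if x == 1 else math.log(1 - p) for x in data)

# Try many candidate values of p and keep the one that makes
# the observed flips most likely.
candidates = [i / 1000 for i in range(1, 1000)]
p_hat = max(candidates, key=lambda p: log_likelihood(p, flips))

print(p_hat)  # the maximum likelihood estimate
```

Here the answer agrees with the sample proportion of heads (7 out of 10), which is exactly what the mathematics predicts for this model.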

History

Early users of maximum likelihood were Carl Friedrich Gauss, Pierre-Simon Laplace, Thorvald N. Thiele, and Francis Ysidro Edgeworth.

Ronald Fisher in 1913

However, its widespread use arose between 1912 and 1922, when Ronald Fisher recommended, widely popularized, and carefully analyzed maximum-likelihood estimation (though his attempts at proofs were unsuccessful).

Maximum-likelihood estimation finally transcended heuristic justification in a proof published by Samuel S. Wilks in 1938, now called "Wilks' theorem". The theorem shows that the error in the logarithm of likelihood values for estimates from multiple independent samples is asymptotically χ²-distributed, which enables convenient determination of a confidence region around any one estimate of the parameters. The only difficult part of Wilks' proof depends on the expected value of the Fisher information matrix, which, ironically, is provided by a theorem proven by Fisher. Wilks continued to improve the generality of the theorem throughout his life, with his most general proof published in 1962.
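A rough sketch of how Wilks' theorem yields a confidence region, again with made-up coin-flip data: the theorem says that twice the drop in log-likelihood from its maximum is approximately χ²-distributed, so keeping every parameter value whose log-likelihood is within half the 95% critical value (3.841 for one degree of freedom) of the maximum gives an approximate 95% confidence region. The grid search below is for illustration, not a practical implementation.

```python
import math

# Made-up data: 7 heads observed in 10 coin flips.
heads, n = 7, 10

def log_likelihood(p):
    """Log-probability of 7 heads and 3 tails if heads has probability p."""
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

p_hat = heads / n       # the maximum likelihood estimate, 0.7
cutoff = 3.841 / 2      # half the chi-squared(1 df) 95% critical value

# Keep every p whose log-likelihood is within the cutoff of the maximum;
# by Wilks' theorem this set is an approximate 95% confidence region.
region = [i / 1000 for i in range(1, 1000)
          if log_likelihood(p_hat) - log_likelihood(i / 1000) <= cutoff]

# Endpoints of the approximate 95% confidence interval around p_hat.
print(min(region), max(region))
```

The resulting interval is not symmetric around the estimate, which is typical of likelihood-based confidence regions.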

Some of the theory behind maximum likelihood estimation was developed for Bayesian statistics.

Reviews of the development of maximum likelihood estimation have been provided by a number of authors.

See also

In Spanish: Máxima verosimilitud para niños

Maximum likelihood Facts for Kids. Kiddle Encyclopedia.