
Statistical and Inductive Inference by Minimum Message Length 2005 Edition
Contributor(s): Wallace, C. S. (Author)
ISBN: 038723795X     ISBN-13: 9780387237954
Publisher: Springer
OUR PRICE:   $161.49  
Product Type: Hardcover
Published: May 2005
Annotation: The Minimum Message Length (MML) Principle is an information-theoretic approach to induction, hypothesis testing, model selection, and statistical inference. MML, which provides a formal specification for the implementation of Occam's Razor, asserts that the "best" explanation of observed data is the shortest. Further, an explanation is acceptable (i.e., the induction is justified) only if the explanation is shorter than the original data.
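The two-part message idea described above can be illustrated with a minimal sketch (not from the book; all names and the fixed parameter precision are illustrative assumptions — Wallace's treatment derives the optimal precision rather than fixing it): state a hypothesis to finite precision, then encode the data using code lengths of -log2 P(x | hypothesis), and prefer the hypothesis giving the shorter total.

```python
import math

def message_length_bits(data, p, precision_bits=6):
    """Two-part message length for binary data under a Bernoulli(p) model.

    Part 1: state p to a fixed precision (a simplification; MML proper
    chooses the precision to minimise the total length).
    Part 2: encode each observation with code length -log2 P(x | p).
    """
    heads = sum(data)
    tails = len(data) - heads
    hypothesis_len = precision_bits
    data_len = -(heads * math.log2(p) + tails * math.log2(1 - p))
    return hypothesis_len + data_len

data = [1] * 75 + [0] * 25          # 75 heads, 25 tails
fair = message_length_bits(data, 0.5)
biased = message_length_bits(data, 0.75)

# The biased-coin explanation yields the shorter total message, so MML
# prefers it; it is also "acceptable" in the sense above because its
# total length is below the 100 bits needed to send the raw data.
```

Note how both criteria from the annotation appear: model comparison is a comparison of total message lengths, and the induction is justified only when that total undercuts the length of the data transmitted verbatim.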

This book gives a sound introduction to the Minimum Message Length Principle and its applications, provides the theoretical arguments for the adoption of the principle, and shows the development of certain approximations that assist its practical application. MML appears also to provide both a normative and a descriptive basis for inductive reasoning generally, and scientific induction in particular. The book describes this basis and aims to show its relevance to the Philosophy of Science.

Statistical and Inductive Inference by Minimum Message Length will be of special interest to graduate students and researchers in Machine Learning and Data Mining, scientists and analysts in various disciplines wishing to make use of computer techniques for hypothesis discovery, statisticians and econometricians interested in the underlying theory of their discipline, and persons interested in the Philosophy of Science. The book could also be used in a graduate-level course in Machine Learning and Estimation and Model-selection, Econometrics and Data Mining.

"Any statistician interested in the foundations of the discipline, or the deeper philosophical issues of inference, will find this volume a rewarding read." Short Book Reviews of the International Statistical Institute, December 2005

Additional Information
BISAC Categories:
- Mathematics | Probability & Statistics - General
- Computers | Information Theory
- Computers | Computer Science
Dewey: 519.5
LCCN: 2004059195
Series: Information Science and Statistics
Physical Information: 0.99" H x 6.16" W x 9.56" (1.65 lbs) 432 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
My thanks are due to the many people who have assisted in the work reported here and in the preparation of this book. The work is incomplete and this account of it rougher than it might be. Such virtues as it has owe much to others; the faults are all mine.

My work leading to this book began when David Boulton and I attempted to develop a method for intrinsic classification. Given data on a sample from some population, we aimed to discover whether the population should be considered to be a mixture of different types, classes or species of thing, and, if so, how many classes were present, what each class looked like, and which things in the sample belonged to which class. I saw the problem as one of Bayesian inference, but with prior probability densities replaced by discrete probabilities reflecting the precision to which the data would allow parameters to be estimated. Boulton, however, proposed that a classification of the sample was a way of briefly encoding the data: once each class was described and each thing assigned to a class, the data for a thing would be partially implied by the characteristics of its class, and hence require little further description. After some weeks' arguing our cases, we decided on the maths for each approach, and soon discovered they gave essentially the same results. Without Boulton's insight, we may never have made the connection between inference and brief encoding, which is the heart of this work.