Optimal Estimation of Parameters
Jorma Rissanen

Hardback, $211.95
Cambridge University Press, 07 June 2012 (English)

This book presents a comprehensive and consistent theory of estimation. The framework described leads naturally to a generalized maximum capacity estimator. This approach allows the optimal estimation of real-valued parameters, their number and intervals, as well as providing common ground for explaining the power of these estimators. Beginning with a review of coding and the key properties of information, the author goes on to discuss the techniques of estimation and develops the generalized maximum capacity estimator, based on a new form of Shannon's mutual information and channel capacity. Applications of this powerful technique in hypothesis testing and denoising are described in detail. Offering an original and thought-provoking perspective on estimation theory, Jorma Rissanen's book is of interest to graduate students and researchers in the fields of information theory, probability and statistics, econometrics and finance.
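
For context, the 'mutual information and channel capacity' referred to above are, in their standard textbook form for a discrete channel (the book develops a generalized form of these quantities, which is not reproduced here):

    I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\, p(y)}, \qquad C = \max_{p(x)} I(X;Y)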

By:   Jorma Rissanen
Imprint:   Cambridge University Press
Country of Publication:   United Kingdom
Dimensions:   Height: 254mm,  Width: 178mm,  Spine: 12mm
Weight:   510g
ISBN:   9781107004740
ISBN 10:   1107004748
Pages:   170
Publication Date:   07 June 2012
Audience:   Professional and scholarly, Undergraduate
Format:   Hardback
Publisher's Status:   Active

Jorma Rissanen was a member of the research staff at the IBM Almaden Research Center from 1965 to 2001 and is currently Professor Emeritus at the Technical University of Tampere, Finland. Among his main achievements are the introduction of the MDL principle for statistics, the invention of arithmetic coding and the introduction of variable-length Markov chains with the associated Algorithm Context. He has received many awards, including the 2007 Kolmogorov Medal from the CLRC, University of London, and the 2009 Shannon Award from the IEEE Information Theory Society. He also received two Outstanding Innovation Awards from IBM, in 1980 and 1988, and an IBM Corporate Award in 1991.

Reviews for Optimal Estimation of Parameters

Advance praise: 'The minimum description length (MDL) principle is a very universal principle of statistical modeling in estimation, prediction, testing, and coding. Jorma Rissanen, the pioneer of the MDL principle, evolves a new theory to reach the most general and complete notion, which he calls the complete MDL principle. In this book the author derives it by introducing the key notion of maximum capacity. The most fundamental methods of estimation such as maximum likelihood estimation and the MDL estimation are naturally derived as the maximum capacity estimators, and their optimality is justified within a unifying theoretical framework. Through the book, readers can revisit the meaning of estimation from the author's very original viewpoint, and will enjoy the most advanced version of the MDL principle.' Kenji Yamanishi, University of Tokyo

'In this splendid new book, Jorma Rissanen, the originator of the minimum description length (MDL) principle, puts forward a comprehensive theory of estimation which differs in several ways from the standard Bayesian and frequentist approaches. During the development of MDL over the last 30 years, it gradually emerged that MDL could be viewed, informally, as a maximum probability principle that directly extends Fisher's classical maximum likelihood method to allow for estimation of a model's structural properties. Yet providing a formal link between MDL and maximum probability remained elusive until the arrival of this book. By making the connection mathematically precise, Rissanen now ties up the loose ends of MDL theory and at the same time develops a beautiful, unified, entirely original and fully coherent theory of estimation, which includes hypothesis testing as a special case.' Peter Grünwald, Centrum voor Wiskunde en Informatica, The Netherlands
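
As the reviewers note, MDL can be read informally as an extension of maximum likelihood to the estimation of a model's structural properties, such as the number of parameters. Below is a minimal, illustrative sketch of that idea in Python, using the crude two-part (BIC-style) code length rather than the book's complete MDL or maximum capacity estimator; the data and names are hypothetical, chosen only for the example:

    # Illustrative only: crude two-part MDL / BIC-style model order selection,
    # not the book's complete MDL or maximum capacity estimator.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 200)
    y = 1.0 - 2.0 * x + 0.5 * x**3 + rng.normal(scale=0.1, size=x.size)  # synthetic cubic data

    def two_part_code_length(x, y, degree):
        """Data cost n/2 * log(RSS/n) plus parameter cost (k/2) * log(n), in nats."""
        n = x.size
        k = degree + 1                               # number of real-valued coefficients
        coeffs = np.polyfit(x, y, degree)
        rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
        data_term = 0.5 * n * np.log(rss / n)        # negative log-likelihood up to constants
        model_term = 0.5 * k * np.log(n)             # ~ (1/2) log n nats per parameter
        return data_term + model_term

    scores = {d: two_part_code_length(x, y, d) for d in range(8)}
    print("selected degree:", min(scores, key=scores.get))

The (1/2) log n penalty per parameter is the standard asymptotic approximation of the parameter code length in the crude two-part code; it is used here only to make the maximum-likelihood-plus-complexity idea concrete.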

