The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In the first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite "scheme," and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes, and attempts "to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory."

Partial Contents: I. The Entropy Concept in Probability Theory - Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov Chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory - Two Generalizations of Shannon's Inequality. Three Inequalities of Feinstein. Concept of a Source. Stationarity. Entropy. Ergodic Sources. The E Property. The Martingale Concept. Noise. Anticipation and Memory. Connection of the Channel to the Source. Feinstein's Fundamental Lemma. Coding. The First Shannon Theorem. The Second Shannon Theorem.
By: A. Ya. Khinchin
Translated by: Silverman, Friedman
Imprint: Dover Publications Inc.
Country of Publication: United States
Dimensions: Height 202mm, Width 136mm, Spine 7mm
Weight: 148g
ISBN: 9780486604343
ISBN 10: 0486604349
Series: Dover Books on Mathematics
Pages: 128
Publication Date: 01 June 1957
Audience:
College/higher education; Professional and scholarly; Postgraduate, Research & Scholarly; A / AS level; Undergraduate
Format: Paperback
Publisher's Status: Unspecified