Information Theory
Courier Corporation, Jun 14, 2012 - 352 pages

Developed by Claude Shannon and Norbert Wiener in the late 1940s, information theory, or statistical communication theory, deals with the theoretical underpinnings of a wide range of communication devices: radio, television, radar, computers, telegraphy, and more. This book is an excellent introduction to the mathematics underlying the theory.

Designed for upper-level undergraduates and first-year graduate students, the book treats three major areas: analysis of channel models and proof of coding theorems (chapters 3, 7, and 8); study of specific coding systems (chapters 2, 4, and 5); and study of statistical properties of information sources (chapter 6). Among the topics covered are noiseless coding, the discrete memoryless channel, error-correcting codes, information sources, channels with memory, and continuous channels.

The author has tried to keep the prerequisites to a minimum; however, students should have a knowledge of basic probability theory. Some measure and Hilbert space theory is also helpful for the last two sections of chapter 8, which treat time-continuous channels. An appendix summarizes the Hilbert space background and the results from the theory of stochastic processes necessary for these sections. The appendix is not self-contained, but it will serve to pinpoint some of the specific equipment needed for the analysis of time-continuous channels.

In addition to historical notes at the end of each chapter indicating the origin of some of the results, the author has included 60 problems with detailed solutions, making the book especially valuable for independent study.