Information Theory and Coding. McGraw-Hill, 1963. 201 pages.
Contents
Preface v | 7
SOME PROPERTIES | 45
CODING INFORMATION | 65
Copyright
2 other sections not shown
Common terms and phrases
amount of information, approximation, average length, average number, binary code, binary messages, binary sequence, binary source, bits, block code, calculate, cascade, channel capacity, channel matrix, channel output, code alphabet, code symbols, code words, compact code, conditional probabilities, construct, corresponding, define, definition, encode, equation, equiprobable, ergodic, error probability, example, first-order Markov source, given, Hamming distance, information channel, information theory, input alphabet, input symbols, instantaneous code, joint entropy, Kraft inequality, logarithm, maximum-likelihood decision rule, message error, message rate, mth-order Markov source, mutual information, nth extension, number of binits, obtain, original source, output alphabet, output symbols, possible, probability of error, proof, r-ary, random variable, reduced source, sequence of code, Shannon's first theorem, Shannon's second theorem, shown in Figure, source alphabet, source symbols, summation, transmitted, uniquely decodable codes, word lengths, zero, zero-memory source