Information Theory and Coding. McGraw-Hill, 1963. 201 pages.
Contents

SOME PROPERTIES | 45
CODING INFORMATION | 65
514 | 133
Copyright

(2 other sections not shown)
Common terms and phrases
amount of information, average length, average number, B₁, binary erasure channel, binary messages, binary sequence, bits, block code, calculate, cascade, channel capacity, channel input, channel matrix, channel output, code alphabet, code symbols, code words, compact code, conditional probability, corresponding, defined, encode, Equation, equiprobable, error probability, example, Find I(A, first-order Markov source, given, Hamming distance, Hence, information channel, information source, information theory, input alphabet, input symbols, instantaneous code, joint entropy, Kraft inequality, logarithm, maximum-likelihood decision rule, message error, message rate, mutual information, Note, nth extension, number of binits, obtain, output alphabet, output symbols, P₁, priori probabilities, probability of error, probability of message, proof, r-ary, random variable, received sequence, reduced source, S(np, sequence of code, Shannon's first theorem, Shannon's second theorem, shown in Figure, Sj₂, source alphabet, source symbols, summation, Table, transmitted, uniquely decodable codes, word lengths, zero, zero-memory source