Information Theory and Coding, by Norman Abramson. McGraw-Hill, 1963 - 201 pages.
Common terms and phrases
amount of information, approximation, average length, average number, b₁, binary code, binary messages, binary sequence, binary source, bits, block code, calculate, cascade, channel capacity, channel matrix, channel output, code alphabet, code symbols, code words, compact code, conditional probabilities, construct, corresponding, define, definition, encode, Equation, equiprobable, ergodic, error probability, example, first-order Markov source, given, Hamming distance, information channel, information source, information theory, input alphabet, input symbols, instantaneous code, joint entropy, Kraft inequality, log P(aᵢ), logarithm, maximum-likelihood decision rule, message rate, mth-order Markov source, mutual information, Note, nth extension, number of binits, obtain, original source, output alphabet, output symbols, P(sᵢ), possible, posteriori, probability of error, proof, r-ary, random variable, reduced source, sequence of code, Shannon's first theorem, Shannon's second theorem, shown in Figure, sⱼ₂, source alphabet, source symbols, summation, Table, transmitted, uniquely decodable codes, word lengths, zero, zero-memory source