Information Theory and Coding. McGraw-Hill, 1963. 201 pages.
Common terms and phrases
amount of information, average length, average number, B₁, binary code, binary erasure channel, binary sequence, binary source, bits, block code, cascade, channel capacity, channel matrix, channel output, code alphabet, code symbols, code words, compact code, conditional probabilities, construct, corresponding, defined, definition, encode, Equation, ergodic, error probability, example, Find I(A; B), first-order Markov source, given, Hamming distance, information channel, information source, information theory, input alphabet, input symbols, instantaneous code, joint entropy, Kraft inequality, logarithm, maximum-likelihood decision rule, message rate, mutual information, Note, nth extension, number of binits, obtain, original source, output alphabet, output symbols, P(aᵢ), P(sᵢ), P₁, possible, probability of error, proof, r-ary, random variable, reduced source, Section, sequence of code, Shannon's first theorem, Shannon's second theorem, shown in Figure, sⱼ₂, source alphabet, source symbols, statistical, summation, Table, transmitted, uniquely decodable codes, word lengths, zero, zero-memory source
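Several of the listed terms (zero-memory source, average length, compact code, Kraft inequality) are computable quantities. The sketch below is a minimal illustration, not taken from the book: it computes the entropy of a hypothetical zero-memory source, the average length of a candidate binary code for it, and its Kraft sum. The probabilities, word lengths, and function names are invented for the example.

```python
# Illustrative sketch only; the source symbols, probabilities, and code
# word lengths below are hypothetical, not examples from the book.
import math

def entropy(probs, r=2):
    """H(S) = -sum p_i * log_r(p_i), in r-ary units (bits when r = 2)."""
    return -sum(p * math.log(p, r) for p in probs if p > 0)

def kraft_sum(word_lengths, r=2):
    """Kraft inequality: an instantaneous r-ary code with these word
    lengths exists iff sum over i of r**(-l_i) <= 1."""
    return sum(r ** (-l) for l in word_lengths)

# A zero-memory source with four symbols and a candidate binary code.
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]  # word lengths of a compact code for these probs

H = entropy(probs)                               # 1.75 bits per symbol
L = sum(p * l for p, l in zip(probs, lengths))   # average length, also 1.75
K = kraft_sum(lengths)                           # 1.0 <= 1, so realizable

print(f"H(S) = {H:.3f} bits, average length = {L:.3f}, Kraft sum = {K:.3f}")
```

Here the average length equals the entropy, so the candidate code is compact for this source; in general Shannon's first theorem only guarantees H(S) <= L, with equality achievable in the limit of coding the nth extension.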