Information Theory and Coding. McGraw-Hill, 1963. 201 pages.
Contents
Preface | 7
SOME PROPERTIES | 45
CODING INFORMATION | 65
Copyright
(2 other sections not shown)
Common terms and phrases
amount of information, average length, average number, binary messages, binary sequence, binit errors, bits, block code, cascade, channel capacity, channel input, channel matrix, channel output, code alphabet, code symbols, code words, compact code, conditional probabilities, construct, corresponding, defined, encode, Equation, equiprobable, error probability, example, Fano, first-order Markov source, given, Hamming distance, Hence, information channel, information source, information theory, input alphabet, input symbols, instantaneous code, joint entropy, Kraft inequality, log P(a, maximum-likelihood decision rule, message error, message rate, mutual information, Note, nth extension, number of binits, obtain, output alphabet, output symbols, P(ai, P(si, P₁, a priori probabilities, probability of error, probability of message, proof, r-ary, random variable, received sequence, reduced source, S(np, sequence of code, Shannon's first theorem, Shannon's second theorem, shown in Figure, Sj₂, source alphabet, source symbols, summation, Table, transmitted, uniquely decodable codes, word lengths, zero, zero-memory source