Information Theory and Coding. McGraw-Hill, 1963. 201 pages.
Contents
Preface
SOME PROPERTIES
CODING INFORMATION
Copyright
Common terms and phrases
achieve, alphabet, amount of information, approximation, assume, average length, binary, binits, bits, bound, calculate, called, capacity, channel matrix, Chapter, code alphabet, code symbols, code words, compact code, conditional, Consider, construct, corresponding, decision rule, define, definition, depend, described, distribution, encode, English, entropy, equal, Equation, error probability, example, expression, fact, Figure, Find, first-order, follows, four, function, given, Hence, important, increases, inequality, information channel, information source, information theory, input symbols, instantaneous code, interest, interpretation, less, Markov source, measure, message rate, method, mutual information, natural, necessary, Note, nth extension, obtain, occur, original, output symbols, possible, probability of error, problem, proof, properties, prove, question, random, received, reduced, represent, respectively, satisfy, sequence, source alphabet, source symbols, symbol probabilities, Table, theorem, transmitted, wish, word lengths, zero, zero-memory source