Information Theory and Coding. McGraw-Hill, 1963. 201 pages.
Common terms and phrases
achieve, alphabet, amount of information, apply, approximation, assume, average length, binary, binits, bits, bound, calculate, called, capacity, cascade, channel matrix, Chapter, code alphabet, code symbols, code words, compact code, conditional, Consider, construct, corresponding, decision rule, defined, definition, depend, described, distribution, encode, English, entropy, equal, Equation, error probability, example, expression, fact, Figure, Find, first-order, follows, four, function, given, Hence, important, increases, inequality, information channel, information source, information theory, input alphabet, input symbols, instantaneous code, interest, interpretation, less, Markov source, measure, method, mutual information, natural, necessary, Note, nth extension, observation, obtain, occur, original, output symbols, possible, probability of error, Problem, proof, properties, prove, question, received, represent, respectively, satisfy, sequence, source symbols, statistical, symbol probabilities, Table, theorem, transmitted, wish, word lengths, zero, zero-memory source