A First Course in Information Theory. Springer Science & Business Media, April 30, 2002. 412 pages.

A First Course in Information Theory is an up-to-date introduction to information theory. In addition to the classical topics, it provides the first comprehensive treatment of the theory of the I-Measure, network coding theory, Shannon-type and non-Shannon-type information inequalities, and a relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, the book is well suited as a textbook for a senior or graduate-level course on the subject, as well as a reference for researchers in related fields.
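As a small taste of the book's core quantity, Shannon entropy measures the uncertainty of a discrete distribution in bits. The sketch below is illustrative only; the function name `entropy` is ours and is not part of the ITIP package mentioned above.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, by the convention
    0 * log 0 = 0 used throughout information theory.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit of uncertainty;
# a deterministic outcome carries none.
print(entropy([0.5, 0.5]))  # -> 1.0
print(entropy([1.0]))       # -> 0.0
```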