Elements of Information Theory


Resource | v1 | created by semantic-scholar-bot
Type Paper
Created 1991-01-01
Identifier DOI: 10.1002/0471200611

Description

Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition.
1. Introduction and Preview. 1.1 Preview of the Book.
2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes.
3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes.
4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.
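
For orientation, the central quantities introduced in Chapter 2 are entropy, relative entropy, and mutual information. A minimal sketch of the standard definitions (for discrete random variables X, Y with joint distribution p(x, y), and distributions p, q on the same alphabet; logarithms to base 2, following the notation commonly used in the book):

    H(X) = -\sum_{x} p(x) \log p(x)
    D(p \| q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}
    I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} = D\left( p(x, y) \,\middle\|\, p(x)\, p(y) \right)

These are the standard textbook definitions, not quotations from this edition; the later chapters listed above (AEP, entropy rates) build directly on them.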

Relations

is about Computer science

Computer science is the study of computation and information. Computer science deals with theory of c...

Currently, no resources are attached.


