Entropy and Information Theory

Author(s): Gray R. M.
Year of publication: 1990
Description: This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information.
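
For orientation, the standard discrete-alphabet definitions of the measures named above are sketched below in LaTeX notation. This is a minimal reference, not an excerpt from the book; the book develops these quantities in far greater generality, for standard alphabets and asymptotically mean stationary sources.

% Standard definitions for discrete random variables X, Y with joint
% pmf p(x, y); p and q denote pmfs on a common alphabet. These are the
% usual textbook forms; the book extends them to general alphabets.
\begin{align*}
  H(X)        &= -\sum_{x} p(x)\log p(x)                         && \text{(entropy)} \\
  H(X \mid Y) &= -\sum_{x,y} p(x,y)\log p(x \mid y)              && \text{(conditional entropy)} \\
  I(X;Y)      &= \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}  && \text{(mutual information)} \\
  D(p\,\|\,q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}              && \text{(relative entropy, or divergence)} \\
  \bar{H}(X)  &= \lim_{n\to\infty}\frac{1}{n}\,H(X_0,\dots,X_{n-1}) && \text{(entropy rate)}
\end{align*}

Chapter 2 of the book makes the entropy definitions precise, and Chapter 5 develops relative entropy and information for general alphabets.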
Оглавление:
1 Information Sources [1]
  1.1 Introduction [1]
  1.2 Probability Spaces and Random Variables [1]
  1.3 Random Processes and Dynamical Systems [5]
  1.4 Distributions [6]
  1.5 Standard Alphabets [10]
  1.6 Expectation [11]
  1.7 Asymptotic Mean Stationarity [14]
  1.8 Ergodic Properties [15]
2 Entropy and Information [17]
  2.1 Introduction [17]
  2.2 Entropy and Entropy Rate [17]
  2.3 Basic Properties of Entropy [20]
  2.4 Entropy Rate [31]
  2.5 Conditional Entropy and Information [35]
  2.6 Entropy Rate Revisited [41]
  2.7 Relative Entropy Densities [44]
3 The Entropy Ergodic Theorem [47]
  3.1 Introduction [47]
  3.2 Stationary Ergodic Sources [50]
  3.3 Stationary Nonergodic Sources [56]
  3.4 AMS Sources [59]
  3.5 The Asymptotic Equipartition Property [63]
4 Information Rates I [65]
  4.1 Introduction [65]
  4.2 Stationary Codes and Approximation [65]
  4.3 Information Rate of Finite Alphabet Processes [73]
5 Relative Entropy [77]
  5.1 Introduction [77]
  5.2 Divergence [77]
  5.3 Conditional Relative Entropy [92]
  5.4 Limiting Entropy Densities [104]
  5.5 Information for General Alphabets [106]
  5.6 Some Convergence Results [116]
6 Information Rates II [119]
  6.1 Introduction [119]
  6.2 Information Rates for General Alphabets [119]
  6.3 A Mean Ergodic Theorem for Densities [122]
  6.4 Information Rates of Stationary Processes [124]
7 Relative Entropy Rates [131]
  7.1 Introduction [131]
  7.2 Relative Entropy Densities and Rates [131]
  7.3 Markov Dominating Measures [134]
  7.4 Stationary Processes [137]
  7.5 Mean Ergodic Theorems [140]
8 Ergodic Theorems for Densities [145]
  8.1 Introduction [145]
  8.2 Stationary Ergodic Sources [145]
  8.3 Stationary Nonergodic Sources [150]
  8.4 AMS Sources [153]
  8.5 Ergodic Theorems for Information Densities [156]
9 Channels and Codes [159]
  9.1 Introduction [159]
  9.2 Channels [160]
  9.3 Stationarity Properties of Channels [162]
  9.4 Examples of Channels [165]
  9.5 The Rohlin-Kakutani Theorem [185]
10 Distortion [191]
  10.1 Introduction [191]
  10.2 Distortion and Fidelity Criteria [191]
  10.3 Performance [193]
  10.4 The rho-bar Distortion [195]
  10.5 d-bar Continuous Channels [197]
  10.6 The Distortion-Rate Function [201]
11 Source Coding Theorems [211]
  11.1 Source Coding and Channel Coding [211]
  11.2 Block Source Codes for AMS Sources [211]
  11.3 Block Coding Stationary Sources [221]
  11.4 Block Coding AMS Ergodic Sources [222]
  11.5 Subadditive Fidelity Criteria [228]
  11.6 Asynchronous Block Codes [230]
  11.7 Sliding Block Source Codes [232]
  11.8 A Geometric Interpretation of OPTA's [241]
12 Coding for Noisy Channels [243]
  12.1 Noisy Channels [243]
  12.2 Feinstein's Lemma [244]
  12.3 Feinstein's Theorem [247]
  12.4 Channel Capacity [249]
  12.5 Robust Block Codes [254]
  12.6 Block Coding Theorems for Noisy Channels [257]
  12.7 Joint Source and Channel Block Codes [258]
  12.8 Synchronizing Block Channel Codes [261]
  12.9 Sliding Block Source and Channel Coding [265]
Bibliography [275]
Index [284]
Format: DjVu
Size: 1089603 bytes
Language: English
Rating: 24