Preface  iii

A Model for a Communication System  3
A Quantitative Measure of Information  5
A Binary Unit of Information  7
Main Contributors to Information Theory  11
An Outline of Information Theory  14

Part 1: Discrete Schemes without Memory

Basic Concepts of Probability
Theorem of Multiplication  44
Combinatorial Problems in Probability  49
Discrete Probability Functions and Distribution  59
Bivariate Discrete Distributions  61
Expected Value of a Random Variable  67

Basic Concepts of Information Theory: Memoryless Finite Schemes
An Intuitive Justification  78
Formal Requirements for the Average Uncertainty  80
H Function as a Measure of Uncertainty  82
An Alternative Proof That the Entropy Function Possesses a Maximum  86
Sources and Binary Sources  89
Measure of Information for Two-dimensional Discrete Finite Probability Schemes  91
A Sketch of a Communication Network  96
Derivation of the Noise Characteristics of a Channel  99
Some Basic Relationships among Different Entropies  101
A Measure of Mutual Information  104
Set-theory Interpretation of Shannon's Fundamental Inequalities  106
Redundancy, Efficiency, and Channel Capacity  108
Capacity of Channels with Symmetric Noise Structures  111
Capacity of Binary Channels  115
Binary Pulse Width Communication Channel  122
Uniqueness of the Entropy Function  124
Necessary and Sufficient Conditions for Noiseless Coding  142
A Theorem on Decodability  147
Average Length of Encoded Messages  148
Shannon's Binary Encoding  151
Fundamental Theorem of Discrete Noiseless Coding  154
Huffman's Minimum-redundancy Code  155
Fundamental Theorem of Discrete Encoding in Presence of Noise  160
Error-detecting and Error-correcting Codes  166
Geometry of the Binary Code Space  168
Hamming's Single-error Correcting Code  171
Elias's Iteration Technique  176
A Mathematical Proof of the Fundamental Theorem of Information Theory for Discrete BSC  180
Encoding the English Alphabet  183

Part 2: Continuous Schemes without Memory

Continuous Probability Distribution and Density
Probability Distribution Functions  192
Probability Density Function  194
Multidimensional Random Variables  200
Joint Distribution of Two Variables: Marginal Distribution  202
Conditional Probability Distribution and Density  204
Bivariate Normal Distribution  206
Functions of Random Variables  208
Transformation from Cartesian to Polar Coordinate System  214
Expected Values; Discrete Case  220
Expectation of Sums and Products of a Finite Number of Independent Discrete Random Variables  222
Moments of a Univariate Random Variable  224
Moments of Bivariate Random Variables  229
Linear Combination of Random Variables  232
Moments of Some Common Distribution Functions  234
Characteristic Function of a Random Variable  238
Characteristic Function and Moment-generating Function of Random Variables  239
Density Functions of the Sum of Two Random Variables  242

Normal Distributions and Limit Theorems
Bivariate Normal Considered as an Extension of One-dimensional Normal Distribution  248
Linear Combination of Normally Distributed Independent Random Variables  252
A Simple Random-walk Problem  258
Approximation of the Binomial Distribution by the Normal Distribution  259
Approximation of Poisson Distribution by a Normal Distribution  262
The Laws of Large Numbers  263

Continuous Channel without Memory
Definition of Different Entropies  267
The Nature of Mathematical Difficulties Involved  269
Infiniteness of Continuous Entropy  270
The Variability of the Entropy in the Continuous Case with Coordinate Systems  273
A Measure of Information in the Continuous Case  275
Maximization of the Entropy of a Continuous Random Variable  278
Entropy Maximization Problems  279
Transmission of Information in Presence of Additive Noise  283
Channel Capacity in Presence of Gaussian Additive Noise and Specified Transmitter and Noise Average Power  285
Relation between the Entropies of Two Related Random Variables  287
Note on the Definition of Mutual Information  289

Transmission of Band-limited Signals
Entropies of Continuous Multivariate Distributions  293
Mutual Information of Two Gaussian Random Vectors  295
A Channel-capacity Theorem for Additive Gaussian Noise  297
A Physical Interpretation of the Sampling Theorem  305
The Concept of a Vector Space  308
Fourier-series Signal Space  313
Band-limited Signal Space  315
Entropies of Band-limited Ensemble in Signal Space  320
A Mathematical Model for Communication of Continuous Signals  322
A Lower Bound for the Probability of Error  325
An Upper Bound for the Probability of Error  327
Fundamental Theorem of Continuous Memoryless Channels in Presence of Additive Noise  329

Part 3: Schemes with Memory

Examples of a Stochastic Process  341
Correlation Coefficients and Correlation Functions  349
Example of a Normal Stochastic Process  352
Examples of Computation of Correlation Functions  353
Some Elementary Properties of Correlation Functions of Stationary Processes  356
Power Spectra and Correlation Functions  357
Response of Linear Lumped Systems to Ergodic Excitation  359
Stochastic Limits and Convergence  363
Stochastic Differentiation and Integration  365
Gaussian-process Example of a Stationary Process  367
The Over-all Mathematical Structure of the Stochastic Processes  368
A Relation between Positive Definite Functions and Theory of Probability  370

Communication under Stochastic Regimes
Stochastic Nature of Communication  374
A Basic Theorem on Regular Markov Chains  377
Entropy of a Simple Markov Chain  380
Entropy of a Discrete Stationary Source  384
Discrete Channels with Finite Memory  388
Connection of the Source and the Discrete Channel with Memory  389
Connection of a Stationary Source to a Stationary Channel  391

Part 4: Some Recent Developments

The Fundamental Theorem of Information Theory
Preliminaries  398
The Probability of Error in a Decision Scheme  398
A Relation between Error Probability and Equivocation  400
The Extension of Discrete Memoryless Noisy Channels  402
Feinstein's Proof
On Certain Random Variables Associated with a Communication System  403
Shannon's Proof  409
A Relation between Transinformation and Error Probability  412
An Exponential Bound for Error Probability  414
Wolfowitz's Proof  416
A Lemma and Its Application  417
Completion of Wolfowitz's Proof  421
Algebra for Binary n-Digit Words  429
A Detection Scheme for Group Codes  437
Slepian's Technique for Single-error Correcting Group Codes  438
Further Notes on Group Codes  442
Some Bounds on the Number of Words in a Systematic Code  446

APPENDIX: Additional Notes and Tables
N-1. The Gambler with a Private Wire  450
N-2. Some Remarks on Sampling Theorem  452
N-3. Analytic Signals and the Uncertainty Relation  454
N-4. Elias's Proof of the Fundamental Theorem for BSC  457
N-5. Further Remarks on Coding Theory  460
N-6. Partial Ordering of Channels  462
N-7. Information Theory and Radar Problems  464
T-1. Normal Probability Integral  465
T-2. Normal Distributions  466
T-3. A Summary of Some Common Probability Functions  467
T-4. Probability of No Error for Best Group Code  468
T-5. Parity-check Rules for Best Group Alphabets  469
T-6. Logarithms to the Base 2  471
T-7. Entropy of a Discrete Binary Source  476

Bibliography  481
Name Index  491
Subject Index  493