An Introduction to Information Theory

by Fazlollah M. Reza
Format: Paperback
Pub. Date: 2010-07-21
Publisher(s): Dover Publications
  • This Item Qualifies for Free Shipping!*

    *Excludes marketplace orders.

List Price: $24.10

Buy New

Arriving Soon. Will ship when available.
$22.95

Rent Digital

  • Online: 1825 days access, $27.54
  • Downloadable: Lifetime access, $27.54

Used Book

We're Sorry
Sold Out

How Marketplace Works:

  • This item is offered by an independent seller and is not shipped from our warehouse.
  • Item details like edition and cover design may differ from our description; see seller's comments before ordering.
  • Sellers must confirm and ship within two business days; otherwise, the order will be cancelled and refunded.
  • Marketplace purchases cannot be returned to eCampus.com. Contact the seller directly for inquiries; if no response within two days, contact customer service.
  • Additional shipping costs apply to Marketplace purchases. Review shipping costs at checkout.

Summary

This graduate-level study for engineering students presents the elements of modern probability theory, of information theory (with emphasis on its roots in probability theory), and of coding theory. The focus is on such basic concepts as sets, sample space, random variables, information measure, and capacity. Includes many reference tables and an extensive bibliography. 1961 edition.
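
For orientation, the information measure at the heart of the book is Shannon's entropy. As standard background (a sketch, not text quoted from this edition): a discrete source emitting symbols with probabilities p1, ..., pn has average uncertainty

    H = -(p1 log2 p1 + p2 log2 p2 + ... + pn log2 pn)  bits per symbol,

so a fair binary source (p1 = p2 = 1/2) yields H = 1 bit, the "binary unit of information" introduced in the book's opening chapter.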

Table of Contents

Preface iii
Introduction
Communication Processes 1(2)
A Model for a Communication System 3(2)
A Quantitative Measure of Information 5(2)
A Binary Unit of Information 7(2)
Sketch of the Plan 9(2)
Main Contributors to Information Theory 11(3)
An Outline of Information Theory 14(5)
Part 1: Discrete Schemes without Memory
Basic Concepts of Probability
Intuitive Background 19(2)
Sets 21(2)
Operations on Sets 23(1)
Algebra of Sets 24(6)
Functions 30(4)
Sample Space 34(2)
Probability Measure 36(2)
Frequency of Events 38(2)
Theorem of Addition 40(2)
Conditional Probability 42(2)
Theorem of Multiplication 44(2)
Bayes's Theorem 46(3)
Combinatorial Problems in Probability 49(3)
Trees and State Diagrams 52(6)
Random Variables 58(1)
Discrete Probability Functions and Distribution 59(2)
Bivariate Discrete Distributions 61(2)
Binomial Distribution 63(2)
Poisson's Distribution 65(2)
Expected Value of a Random Variable 67(9)
Basic Concepts of Information Theory: Memoryless Finite Schemes
A Measure of Uncertainty 76(2)
An Intuitive Justification 78(2)
Formal Requirements for the Average Uncertainty 80(2)
H Function as a Measure of Uncertainty 82(4)
An Alternative Proof That the Entropy Function Possesses a Maximum 86(3)
Sources and Binary Sources 89(2)
Measure of Information for Two-dimensional Discrete Finite Probability Schemes 91(3)
Conditional Entropies 94(2)
A Sketch of a Communication Network 96(3)
Derivation of the Noise Characteristics of a Channel 99(2)
Some Basic Relationships among Different Entropies 101(3)
A Measure of Mutual Information 104(2)
Set-theory Interpretation of Shannon's Fundamental Inequalities 106(2)
Redundancy, Efficiency, and Channel Capacity 108(3)
Capacity of Channels with Symmetric Noise Structures 111(3)
BSC and BEC 114(1)
Capacity of Binary Channels 115(7)
Binary Pulse Width Communication Channel 122(2)
Uniqueness of the Entropy Function 124(7)
Elements of Encoding
The Purpose of Encoding 131(6)
Separable Binary Codes 137(1)
Shannon-Fano Encoding 138(4)
Necessary and Sufficient Conditions for Noiseless Coding 142(5)
A Theorem on Decodability 147(1)
Average Length of Encoded Messages 148(3)
Shannon's Binary Encoding 151(3)
Fundamental Theorem of Discrete Noiseless Coding 154(1)
Huffman's Minimum-redundancy Code 155(3)
Gilbert-Moore Encoding 158(2)
Fundamental Theorem of Discrete Encoding in Presence of Noise 160(6)
Error-detecting and Error-correcting Codes 166(2)
Geometry of the Binary Code Space 168(3)
Hamming's Single-error Correcting Code 171(5)
Elias's Iteration Technique 176(4)
A Mathematical Proof of the Fundamental Theorem of Information Theory for Discrete BSC 180(3)
Encoding the English Alphabet 183(8)
Part 2: Continuous Schemes without Memory
Continuous Probability Distribution and Density
Continuous Sample Space 191(1)
Probability Distribution Functions 192(2)
Probability Density Function 194(2)
Normal Distribution 196(2)
Cauchy's Distribution 198(1)
Exponential Distribution 199(1)
Multidimensional Random Variables 200(2)
Joint Distribution of Two Variables: Marginal Distribution 202(2)
Conditional Probability Distribution and Density 204(2)
Bivariate Normal Distribution 206(2)
Functions of Random Variables 208(6)
Transformation from Cartesian to Polar Coordinate System 214(6)
Statistical Averages
Expected Values; Discrete Case 220(2)
Expectation of Sums and Products of a Finite Number of Independent Discrete Random Variables 222(2)
Moments of a Univariate Random Variable 224(3)
Two Inequalities 227(2)
Moments of Bivariate Random Variables 229(1)
Correlation Coefficient 230(2)
Linear Combination of Random Variables 232(2)
Moments of Some Common Distribution Functions 234(4)
Characteristic Function of a Random Variable 238(1)
Characteristic Function and Moment-generating Function of Random Variables 239(3)
Density Functions of the Sum of Two Random Variables 242(6)
Normal Distributions and Limit Theorems
Bivariate Normal Considered as an Extension of One-dimensional Normal Distribution 248(2)
Multinormal Distribution 250(2)
Linear Combination of Normally Distributed Independent Random Variables 252(2)
Central-limit Theorem 254(4)
A Simple Random-walk Problem 258(1)
Approximation of the Binomial Distribution by the Normal Distribution 259(3)
Approximation of Poisson Distribution by a Normal Distribution 262(1)
The Laws of Large Numbers 263(4)
Continuous Channel without Memory
Definition of Different Entropies 267(2)
The Nature of Mathematical Difficulties Involved 269(1)
Infiniteness of Continuous Entropy 270(3)
The Variability of the Entropy in the Continuous Case with Coordinate Systems 273(2)
A Measure of Information in the Continuous Case 275(3)
Maximization of the Entropy of a Continuous Random Variable 278(1)
Entropy Maximization Problems 279(3)
Gaussian Noisy Channels 282(1)
Transmission of Information in Presence of Additive Noise 283(2)
Channel Capacity in Presence of Gaussian Additive Noise and Specified Transmitter and Noise Average Power 285(2)
Relation between the Entropies of Two Related Random Variables 287(2)
Note on the Definition of Mutual Information 289(3)
Transmission of Band-limited Signals
Introduction 292(1)
Entropies of Continuous Multivariate Distributions 293(2)
Mutual Information of Two Gaussian Random Vectors 295(2)
A Channel-capacity Theorem for Additive Gaussian Noise 297(2)
Digression 299(1)
Sampling Theorem 300(5)
A Physical Interpretation of the Sampling Theorem 305(3)
The Concept of a Vector Space 308(5)
Fourier-series Signal Space 313(2)
Band-limited Signal Space 315(2)
Band-limited Ensembles 317(3)
Entropies of Band-limited Ensemble in Signal Space 320(2)
A Mathematical Model for Communication of Continuous Signals 322(1)
Optimal Decoding 323(2)
A Lower Bound for the Probability of Error 325(2)
An Upper Bound for the Probability of Error 327(2)
Fundamental Theorem of Continuous Memoryless Channels in Presence of Additive Noise 329(1)
Thomasian's Estimate 330(8)
Part 3: Schemes with Memory
Stochastic Processes
Stochastic Theory 338(3)
Examples of a Stochastic Process 341(2)
Moments and Expectations 343(1)
Stationary Processes 344(3)
Ergodic Processes 347(2)
Correlation Coefficients and Correlation Functions 349(3)
Example of a Normal Stochastic Process 352(1)
Examples of Computation of Correlation Functions 353(3)
Some Elementary Properties of Correlation Functions of Stationary Processes 356(1)
Power Spectra and Correlation Functions 357(2)
Response of Linear Lumped Systems to Ergodic Excitation 359(4)
Stochastic Limits and Convergence 363(2)
Stochastic Differentiation and Integration 365(2)
Gaussian-process Example of a Stationary Process 367(1)
The Over-all Mathematical Structure of the Stochastic Processes 368(2)
A Relation between Positive Definite Functions and Theory of Probability 370(4)
Communication under Stochastic Regimes
Stochastic Nature of Communication 374(2)
Finite Markov Chains 376(1)
A Basic Theorem on Regular Markov Chains 377(3)
Entropy of a Simple Markov Chain 380(4)
Entropy of a Discrete Stationary Source 384(4)
Discrete Channels with Finite Memory 388(1)
Connection of the Source and the Discrete Channel with Memory 389(2)
Connection of a Stationary Source to a Stationary Channel 391(7)
Part 4: Some Recent Developments
The Fundamental Theorem of Information Theory
Preliminaries
A Decision Scheme 398(1)
The Probability of Error in a Decision Scheme 398(2)
A Relation between Error Probability and Equivocation 400(2)
The Extension of Discrete Memoryless Noisy Channels 402(1)
Feinstein's Proof
On Certain Random Variables Associated with a Communication System 403(2)
Feinstein's Lemma 405(1)
Completion of the Proof 406(3)
Shannon's Proof
Ensemble Codes 409(3)
A Relation between Transinformation and Error Probability 412(2)
An Exponential Bound for Error Probability 414(2)
Wolfowitz's Proof
The Code Book 416(1)
A Lemma and Its Application 417(2)
Estimation of Bounds 419(2)
Completion of Wolfowitz's Proof 421(3)
Group Codes
Introduction 424(1)
The Concept of a Group 425(3)
Fields and Rings 428(1)
Algebra for Binary n-Digit Words 429(2)
Hamming's Codes 431(4)
Group Codes 435(2)
A Detection Scheme for Group Codes 437(1)
Slepian's Technique for Single-error Correcting Group Codes 438(4)
Further Notes on Group Codes 442(4)
Some Bounds on the Number of Words in a Systematic Code 446(4)
APPENDIX Additional Notes and Tables
N-1. The Gambler with a Private Wire 450(2)
N-2. Some Remarks on Sampling Theorem 452(2)
N-3. Analytic Signals and the Uncertainty Relation 454(3)
N-4. Elias's Proof of the Fundamental Theorem for BSC 457(3)
N-5. Further Remarks on Coding Theory 460(2)
N-6. Partial Ordering of Channels 462(2)
N-7. Information Theory and Radar Problems 464(1)
T-1. Normal Probability Integral 465(1)
T-2. Normal Distributions 466(1)
T-3. A Summary of Some Common Probability Functions 467(1)
T-4. Probability of No Error for Best Group Code 468(1)
T-5. Parity-check Rules for Best Group Alphabets 469(2)
T-6. Logarithms to the Base 2 471(5)
T-7. Entropy of a Discrete Binary Source 476(5)
Bibliography 481(10)
Name Index 491(2)
Subject Index 493

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online and download it for the chosen duration.

Digital License

You are licensing a digital product for a set duration. Durations are set forth in the product description, with "Lifetime" typically meaning five (5) years of online access and permanent download to a supported device. All licenses are non-transferable.

A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.