Information Theory: A Concise Introduction
Stefan Hollos and J. Richard Hollos
Format and pricing: Paperback (136 pages) $14.95, Kindle/pdf $9.95
ISBN: 9781887187282 (paperback), 9781887187299 (ebook)
Publication date: May 2015
Books on information theory tend to fall into one of two extreme categories. At one extreme are large academic textbooks that cover the subject with great depth and rigor; probably the best known of these is the book by Cover and Thomas. At the other extreme are popular books such as those by Pierce and Gleick, which provide a very superficial introduction to the subject: enough to engage in cocktail party conversation, but little else. This book attempts to bridge these two extremes.
This book is written for someone who is at least semi-mathematically literate and wants a concise introduction to some of the major concepts in information theory. The level of mathematics needed is very elementary. A rudimentary grasp of logarithms, probability, and basic algebra is all that is required. Two chapters at the end of the book review everything the reader needs to know about logarithms and discrete probability to get the most out of the book. Very little attention is given to mathematical proof. Instead, the results are presented in a way that makes them almost obvious, or at least plausible.
The book will appeal to anyone looking for a fast introduction to most of the major topics in information theory: an introduction that is concise but not superficial.
About the authors
Stefan Hollos and J. Richard Hollos are physicists and electrical engineers by training, and enjoy anything related to math, physics, engineering and computing. They are brothers and business partners at Exstrom Laboratories LLC in Longmont, Colorado.
Table of Contents
- Preface
- Introduction
- Number guessing game
- Counterfeit coins
- Encoding Messages
- Nonuniform probabilities
- Kraft-McMillan inequality
- Average code word length
- Huffman Coding
- Arithmetic Coding
- Entropy
- Entropy of a Markov Chain
- Principle of maximum entropy
- Entropy of English
- Channel Capacity
- Channel capacity and gambling
- Error Correction Coding
- Repetition codes
- Parity check codes
- Hamming codes
- Supplementary Material
- Review of logarithms
- Review of Discrete Probability
- References and Further Reading
- Acknowledgements
- About the Authors
Software
This software is free and distributed under the terms of the GNU General Public License. It is written in ANSI C and should compile with any C compiler.
- huffman.c
Generates a Huffman code for symbols s1 ... sn with corresponding probabilities p1 ... pn.
Usage: huffman s1 p1 s2 p2 ... sn pn
Send comments to: Richard Hollos (richard[AT]exstrom DOT com)
Copyright 2022 by Exstrom Laboratories LLC