Probability theory is fundamental to our description of the physical world. The most accurate and complete description that we have of that world is called quantum mechanics. The most famous equation in quantum mechanics is the Schrödinger equation, which allows you to solve for something called a wave function. If you take the square magnitude of a wave function you get a probability distribution. These probability distributions are in a sense very real: they can, for example, predict how atoms bond to each other to form chemical compounds. At its most fundamental level, chemistry is based on probability distributions.

Probability has many uses beyond physics and chemistry. It is used in computer science, engineering, biology, finance, games of chance, and everyday life. Some consider it a form of generalized logic (see, for example, E. T. Jaynes, Probability Theory: The Logic of Science). Probability theory is one of the most widely applied areas of mathematics, and learning how to solve probability problems is a skill that can be used in many fields.

We enjoy solving probability problems. Why? Because there is an almost endless variety of interesting and stimulating problems to choose from. If you like puzzles, you will enjoy solving these problems. Not only is it plain intellectual fun, but it will also sharpen your reasoning and problem-solving skills.

This book deals primarily with discrete probability problems. These are problems that can be solved with very little mathematical background. A good grasp of algebra is required and some previous exposure to probability or combinatorics is helpful. We have included sections that review the basics of discrete probability and combinatorics. If you are already familiar with these then you can go directly to the problems.

Also included are sections on more advanced topics in discrete probability that are helpful in solving some of the more difficult and interesting problems. There is a section on how to calculate probabilities when you have a set of nonexclusive events, meaning that more than one of the events can occur simultaneously. There is a section on solving Bayesian-type problems. These are problems where the probability of a hypothesis is calculated based on some evidence. A simple example is the hypothesis that a die is loaded, with the evidence being the results of rolling the die repeatedly. There is a section on solving collection problems. A simple example of this kind of problem is calculating the probability that it takes more than 10 rolls of a die to get all six faces at least once. Another example is calculating the average number of boxes of cereal you need to buy to collect all the different prizes inside. The final introductory section is on solving run problems. The canonical example here is calculating the average number of times you need to toss a coin to get n heads in a row.
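The examples above all have standard, well-known solutions: the die-face problem follows from inclusion-exclusion over the missing faces, the cereal-box problem is the classic coupon collector expectation s·H_s, and the runs problem has the closed form 2^(n+1) − 2 for a fair coin. The sketch below, in Python, illustrates each; the loaded-die model (a loaded die that shows a six half the time, with the other faces equally likely) is a hypothetical assumption chosen purely for illustration.

```python
from math import comb

def prob_not_all_faces(n, s=6):
    """P(more than n rolls of a fair s-sided die are needed to see every
    face at least once), by inclusion-exclusion over the missing faces."""
    return sum((-1) ** (k + 1) * comb(s, k) * ((s - k) / s) ** n
               for k in range(1, s))

def expected_draws_all(s):
    """Expected draws to collect all s equally likely items: s * H_s,
    where H_s is the s-th harmonic number (coupon collector problem)."""
    return s * sum(1 / i for i in range(1, s + 1))

def expected_tosses_run(n):
    """Expected number of fair-coin tosses to get n heads in a row."""
    return 2 ** (n + 1) - 2

def posterior_loaded(rolls, prior=0.5):
    """Posterior probability the die is loaded, given a list of rolls.
    Hypothetical model: a loaded die shows 6 with probability 1/2 and
    each other face with probability 1/10; a fair die shows each face
    with probability 1/6."""
    like_loaded = like_fair = 1.0
    for r in rolls:
        like_loaded *= 0.5 if r == 6 else 0.1
        like_fair *= 1 / 6
    num = like_loaded * prior
    return num / (num + like_fair * (1 - prior))

print(round(prob_not_all_faces(10), 4))   # ≈ 0.7282
print(expected_draws_all(6))              # ≈ 14.7 rolls on average
print(expected_tosses_run(3))             # 14 tosses for HHH
print(posterior_loaded([6, 6, 6]))        # three sixes make "loaded" likely
```

For instance, the chance that 10 rolls do not show all six faces is about 0.73, and on average 14.7 rolls are needed, which is why "collect all six" takes longer than intuition suggests.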

After the introductory sections come the problems. They generally increase in difficulty as you go and are grouped together somewhat by type. The first few are very easy, and you should consider them warmups for what's to come. Some of the later problems are very challenging and will require significant work to solve. Many of the problems are modernized versions of the problems and exercises found in the book Choice and Chance by W. A. Whitworth (see Further Reading at the end of the book). Happy problem solving.

This book's web page is:

We can be reached by email at:
stefan[at]exstrom DOT com      richard[at]exstrom DOT com

Stefan Hollos and J. Richard Hollos
Exstrom Laboratories LLC
Longmont, Colorado, U.S.A.
April 2013