# Chapter 20 Information Theory

In this chapter we introduce the theoretical concepts behind the security of a cryptosystem. The basic question is the following: If Eve observes a piece of ciphertext, does she gain any new information about the encryption key that she did not already have? To address this issue, we need a mathematical definition of information. This involves probability and the use of a very important measure called entropy.

Many of the ideas in this chapter originated with Claude Shannon in the 1940s.

Before we start, let's consider an example. Roll a standard six-sided die. Let A be the event that the number of dots is odd, and let B be the event that the number of dots is at least 3. If someone tells you that the roll belongs to the event A ∩ B, then you know that there are only two possibilities for what the roll is, namely 3 or 5. In this sense, A ∩ B tells you more about the value of the roll than the event A alone or the event B alone, so the information contained in the event A ∩ B is larger than the information in just A or just B.
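The dice example can be checked by direct enumeration. The sketch below (the names `A`, `B`, and `both` are our labels for the events in the text) lists the outcomes in each event and confirms that only two outcomes remain once A ∩ B is known:

```python
# Model one roll of a fair six-sided die and the two events from the text.
outcomes = {1, 2, 3, 4, 5, 6}
A = {n for n in outcomes if n % 2 == 1}   # odd number of dots: {1, 3, 5}
B = {n for n in outcomes if n >= 3}       # at least 3 dots: {3, 4, 5, 6}
both = A & B                              # the intersection A ∩ B: {3, 5}

print(sorted(A))     # [1, 3, 5]
print(sorted(B))     # [3, 4, 5, 6]
print(sorted(both))  # [3, 5] -- only two possibilities remain
```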

The idea of information is closely linked with the idea of uncertainty. Going back to the example of the die, if you are told that the event A ∩ B happened, you are less uncertain about the value of the roll than if you are simply told that event A occurred. Thus the information increased while the uncertainty decreased. Entropy provides a measure of the increase in information, or equivalently the decrease in uncertainty, provided by the outcome of an experiment.
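Entropy is defined formally later in the chapter, but as a preview, the standard Shannon formula H = -Σ p log₂ p already makes the dice example quantitative. The sketch below (our own illustration, not from the text) shows how the uncertainty, measured in bits, drops as more informative events are revealed:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum of p * log2(p) over the outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uncertainty about a fair die roll before anything is revealed: 6 equally
# likely outcomes, so H = log2(6), about 2.585 bits.
H_roll = entropy([1/6] * 6)

# Told the roll is odd (event A = {1, 3, 5}): 3 outcomes remain, H = log2(3).
H_given_A = entropy([1/3] * 3)

# Told the roll is in A ∩ B = {3, 5}: 2 outcomes remain, H = log2(2) = 1 bit.
H_given_both = entropy([1/2] * 2)

print(H_roll, H_given_A, H_given_both)  # decreasing: ~2.585, ~1.585, 1.0
```

The more informative the event, the fewer bits of uncertainty remain, which is exactly the intuition the chapter develops.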