In probability, we often have events with a variety of possible numerical outcomes. For example, when a die is rolled, the possible numerical outcomes are the integers 1 through 6. When certain games are played, it is possible to obtain various scores, each with a potentially different probability. A quantity called the expectation value can be used to determine what the average value would be after many, many trials.

The expectation value may not be a value that could be obtained in any single trial, but it does represent the average value that would be obtained over many trials. It depends both on the outcome values themselves and on the probabilities with which each of those values might be obtained.

Below, I am providing the definition of an expectation value, but this may not be fully clear until you have looked at one of the examples that follow.

Definition of an Expectation Value

Consider an event that has particular outcome values $x_1, x_2, x_3, \ldots$, each with associated probabilities $p_1, p_2, p_3, \ldots$. Then the expectation value of the event is given as follows.

$$E = x_1 p_1 + x_2 p_2 + x_3 p_3 + \cdots$$

If there are $n$ different outcome values, then this can also be written more compactly by means of summation notation.

$$E = \sum_{i=1}^{n} x_i p_i$$
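
As a quick illustration of the summation formula, here is a minimal Python sketch; the function name and the error checks are my own additions, not part of the original text.

    def expectation(values, probabilities):
        """Compute E = sum of x_i * p_i over paired outcome values and probabilities."""
        if len(values) != len(probabilities):
            raise ValueError("values and probabilities must have the same length")
        if abs(sum(probabilities) - 1.0) > 1e-9:
            raise ValueError("probabilities should sum to 1")
        return sum(x * p for x, p in zip(values, probabilities))

    # A fair six-sided die: outcomes 1 through 6, each with probability 1/6
    print(expectation([1, 2, 3, 4, 5, 6], [1/6] * 6))  # 3.5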

Example

Consider a game with possible scores and probabilities of each score as given by the table shown below.

Score   Probability
1       10%
2       15%
3       20%
4       25%
5       30%

The expected value of the score, or the expected score in this context, would be given by

$$E = (1)(0.10) + (2)(0.15) + (3)(0.20) + (4)(0.25) + (5)(0.30) = 3.5$$

Despite 3 being the “middle score” out of the possible attainable scores, the expectation value here is slightly larger than 3, reflecting the fact that higher scores are more likely to be obtained when playing this game.
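
To double-check this arithmetic, here is a short Python sketch (the variable names are illustrative, not from the original text) that evaluates the same sum directly from the table.

    scores = [1, 2, 3, 4, 5]
    probabilities = [0.10, 0.15, 0.20, 0.25, 0.30]

    # E = sum of score * probability over every row of the table
    expected_score = sum(s * p for s, p in zip(scores, probabilities))
    print(expected_score)  # 3.5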

Practice Problem

Problem

You roll a standard six-sided die and flip a coin. If the coin lands on heads, then the value shown on the die is your score. If the coin lands on tails, then your score is

  • double the value shown on the die if the value shown on the die is even or
  • zero if the value shown on the die is odd.

What is the expected score obtained?
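
If you want to check your answer numerically, here is a small Monte Carlo sketch in Python; it is purely illustrative and not part of the original problem. It simulates the die roll and coin flip many times and averages the resulting scores, which should approach the exact expected score.

    import random

    def play_once():
        die = random.randint(1, 6)     # value shown on the die
        heads = random.random() < 0.5  # fair coin: True for heads, False for tails
        if heads:
            return die                 # heads: score is the value shown on the die
        # tails: double the die value if it is even, zero if it is odd
        return 2 * die if die % 2 == 0 else 0

    trials = 1_000_000
    estimate = sum(play_once() for _ in range(trials)) / trials
    print(f"Estimated expected score: {estimate:.3f}")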