This is “Expected Value”, section 17.7 from the book Theory and Applications of Microeconomics (v. 1.0).



Probability measures the chance that something will occur, expressed as a number between 0 and 1. For example, there is a 50 percent chance that a tossed coin will come up heads: we say that the probability of the *outcome* “*heads*” is 1/2. There are five things you need to know about probability:

- The list of possible outcomes must be complete.
- The list of possible outcomes must not overlap.
- If an outcome is certain to occur, it has probability 1.
- If an outcome is certain not to occur, it has probability 0.
- If we add together the probabilities for all the possible outcomes, the total must equal 1.
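The rules above can be checked mechanically. Here is a minimal sketch in Python (the function name is an illustration, not from the book); it verifies the last three rules for a list of outcome probabilities:

```python
def is_valid_distribution(probs):
    """Check that each probability lies in [0, 1] and that they sum to 1."""
    return all(0 <= p <= 1 for p in probs) and abs(sum(probs) - 1) < 1e-9

# A fair coin: heads and tails each with probability 1/2.
print(is_valid_distribution([0.5, 0.5]))   # True
# An incomplete list of outcomes fails: the probabilities sum to less than 1.
print(is_valid_distribution([0.5, 0.25]))  # False
```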

The expected value of a situation with financial risk is a measure of how much you would expect to win (or lose) on average if the situation were to be replayed a large number of times. You can calculate expected value as follows:

- For each outcome, multiply the probability of that outcome by the amount you will receive.
- Add together these amounts over all the possible outcomes.

For example, suppose you are offered the following proposal. Roll a six-sided die. If it comes up with 1 or 2, you get $90. If it comes up 3, 4, 5, or 6, you get $30. The expected value is

(1/3) × $90 + (2/3) × $30 = $50.

Most people dislike risk. They prefer a fixed sum of money to a gamble that has the same expected value. *Risk aversion* is a measure of how much people want to avoid risk. In the example we just gave, most people would prefer a sure $50 to the uncertain proposal with an expected value of $50.
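The two-step recipe, applied to the die example, can be sketched in Python (the helper name is an illustration, not from the book):

```python
def expected_value(gamble):
    """gamble: a list of (probability, payoff) pairs, one per outcome."""
    # Multiply each payoff by its probability, then add across outcomes.
    return sum(p * payoff for p, payoff in gamble)

# Roll a die: $90 with probability 1/3 (faces 1-2),
# $30 with probability 2/3 (faces 3-6).
print(expected_value([(1/3, 90), (2/3, 30)]))  # $50, up to floating-point rounding
```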

Suppose we present an individual with the following gamble:

- With 99 percent probability, you lose nothing.
- With 1 percent probability, you lose $1,000.

The expected value of this gamble is −$10. Now ask the individual how much she would pay to avoid this gamble. Someone who is risk-neutral would be willing to pay only $10. Someone who is risk-averse would be willing to pay more than $10. The more risk-averse an individual, the more the person would be willing to pay.
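The same calculation applies to this gamble (a sketch; the probabilities and loss are from the text, the helper name is mine):

```python
def expected_value(gamble):
    """gamble: a list of (probability, payoff) pairs, one per outcome."""
    return sum(p * payoff for p, payoff in gamble)

# 99 percent chance of losing nothing, 1 percent chance of losing $1,000.
ev = expected_value([(0.99, 0), (0.01, -1000)])
print(ev)  # -$10, up to floating-point rounding
```

A risk-neutral person would pay up to −ev = $10 to avoid the gamble; a risk-averse person would pay more.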

The fact that risk-averse people will pay to shed risk is the basis of insurance. If people have different attitudes toward risky gambles, then the less risk-averse individual can provide insurance to the more risk-averse individual. There are gains from trade. Insurance is also based on diversification, which is the idea that people can share their risks so it is much less likely that any individual will face a large loss.

- Expected value is the sum, over all possible outcomes, of the probability of each outcome times the gain or loss if that outcome occurs.
- Risk-averse people will pay to avoid risk. This is the basis of insurance.

Consider a gamble where there are three and only three possible outcomes (*x*, *y*, *z*) that occur with probabilities *Pr(x)*, *Pr(y)*, and *Pr(z)*. Think of these outcomes as the number of dollars you get in each case. First, we know that

*Pr(x)* + *Pr(y)* + *Pr(z)* = 1.

Second, the expected value of this gamble is

*Pr(x)* × *x* + *Pr(y)* × *y* + *Pr(z)* × *z*.
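This three-outcome setup can be sketched in code (a hypothetical function, assuming the payoffs are dollar amounts):

```python
def three_outcome_ev(x, y, z, pr_x, pr_y, pr_z):
    """Expected value of a gamble paying x, y, or z dollars."""
    # The probabilities of a complete, non-overlapping list of outcomes
    # must sum to 1.
    assert abs((pr_x + pr_y + pr_z) - 1) < 1e-9, "probabilities must sum to 1"
    # Weight each payoff by its probability and add.
    return pr_x * x + pr_y * y + pr_z * z

# Equal chances of winning $90, $30, or $0.
print(three_outcome_ev(90, 30, 0, 1/3, 1/3, 1/3))  # $40, up to rounding
```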