Law of total expectation
The proposition in probability theory known as the law of total expectation,[1] the law of iterated expectations, the tower rule, the smoothing theorem, and Adam's law, among other names, states that if X is an integrable random variable (i.e., a random variable satisfying E( | X | ) < ∞) and Y is any random variable, not necessarily integrable, on the same probability space, then

E( E( X | Y ) ) = E( X ),
i.e., the expected value of the conditional expected value of X given Y is the same as the expected value of X.
The conditional expected value E( X | Y ) is a random variable in its own right, whose value depends on the value of Y. Notice that the conditional expected value of X given the event Y = y is a function of y. If we write E( X | Y = y) = g(y) then the random variable E( X | Y ) is just g(Y).
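To make the identity concrete, here is a minimal Monte Carlo sketch in Python; the particular distributions chosen for X and Y are illustrative assumptions, not part of the result:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative model: Y ~ Bernoulli(0.3), and conditionally on Y,
# X ~ Normal(5, 1) when Y = 1 and X ~ Normal(2, 1) when Y = 0.
y = rng.random(n) < 0.3
x = rng.normal(np.where(y, 5.0, 2.0), 1.0)

# Here g(y) = E( X | Y = y ) is known in closed form: 5 or 2.
g_of_y = np.where(y, 5.0, 2.0)

# Both averages estimate the same number: E( X ) = 0.3*5 + 0.7*2 = 2.9.
print(x.mean(), g_of_y.mean())
```

The empirical mean of X and the empirical mean of g(Y) agree up to sampling error, as the law predicts.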
One special case states that if { A_i } is a finite or countable partition of the whole outcome space, i.e. these events are mutually exclusive and exhaustive, then

E( X ) = Σ_i E( X | A_i ) P( A_i ).
Example
Suppose that two factories supply light bulbs to the market. Factory X's bulbs work for an average of 5000 hours, whereas factory Y's bulbs work for an average of 4000 hours. It is known that factory X supplies 60% of the total bulbs available. What is the expected length of time that a purchased bulb will work for?
Applying the law of total expectation, we have:

E( L ) = E( L | X ) P( X ) + E( L | Y ) P( Y ) = 5000 · 0.6 + 4000 · 0.4 = 4600
where
- E( L ) is the expected life of the bulb;
- P( X ) = 0.6 is the probability that the purchased bulb was manufactured by factory X;
- P( Y ) = 0.4 is the probability that the purchased bulb was manufactured by factory Y;
- E( L | X ) = 5000 is the expected lifetime of a bulb manufactured by X;
- E( L | Y ) = 4000 is the expected lifetime of a bulb manufactured by Y.
Thus each purchased light bulb has an expected lifetime of 4600 hours.
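The computation is easy to reproduce. In the Python sketch below, the 60/40 split and the conditional means come from the example; modeling individual lifetimes as exponential is an extra assumption made only so there is something to simulate (the law uses only the conditional means):

```python
import random

random.seed(0)

# Exact computation from the law of total expectation.
print(0.6 * 5000 + 0.4 * 4000)  # 4600.0

# Monte Carlo version: pick a factory, then draw a lifetime whose
# mean matches that factory (exponential is an illustrative choice).
def lifetime():
    if random.random() < 0.6:            # factory X, mean 5000 hours
        return random.expovariate(1 / 5000)
    return random.expovariate(1 / 4000)  # factory Y, mean 4000 hours

draws = [lifetime() for _ in range(200_000)]
print(sum(draws) / len(draws))  # close to 4600
```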
Proof in the discrete case

Suppose X and Y are discrete random variables on the same probability space, with X integrable. Expanding the outer expectation over the values of Y and applying the definition of conditional expectation gives

E( E( X | Y ) ) = Σ_y E( X | Y = y ) P( Y = y )
= Σ_y ( Σ_x x · P( X = x | Y = y ) ) P( Y = y )
= Σ_y Σ_x x · P( X = x, Y = y )
= Σ_x x · Σ_y P( X = x, Y = y )
= Σ_x x · P( X = x )
= E( X ).

Interchanging the order of summation is justified because the sums converge absolutely, by the integrability of X.
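The steps of the proof can also be checked mechanically. The sketch below fixes a small joint pmf (an arbitrary toy example) and verifies E( E( X | Y ) ) = E( X ) exactly, using rational arithmetic so the equality is not blurred by floating point:

```python
from fractions import Fraction as F

# An arbitrary toy joint pmf P( X = x, Y = y ), chosen for illustration.
joint = {
    (1, 0): F(1, 8), (2, 0): F(3, 8),
    (1, 1): F(1, 4), (3, 1): F(1, 4),
}

# Marginal distribution of Y.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, F(0)) + p

# E( X ) computed directly from the joint pmf.
e_x = sum(x * p for (x, y), p in joint.items())

# E( X | Y = y ) for each y, then the outer expectation over Y.
def e_x_given(y0):
    return sum(x * p for (x, y), p in joint.items() if y == y0) / p_y[y0]

e_of_e = sum(e_x_given(y) * p for y, p in p_y.items())

assert e_x == e_of_e   # the tower rule holds exactly
print(e_x)             # 15/8
```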
Proof in the general case
The general statement of the result makes reference to a probability space (Ω, F, P) on which two nested sub-σ-algebras G1 ⊆ G2 ⊆ F are defined. For an integrable random variable X on such a space, the smoothing law states that

E( E( X | G2 ) | G1 ) = E( X | G1 ) (almost surely).
Since a conditional expectation is a Radon–Nikodym derivative, verifying the following two properties establishes the smoothing law:

- E( E( X | G2 ) | G1 ) is G1-measurable;
- ∫_A E( E( X | G2 ) | G1 ) dP = ∫_A X dP for all A ∈ G1.
The first of these properties holds by the definition of the conditional expectation, and the second holds since A ∈ G1 ⊆ G2 implies

∫_A E( E( X | G2 ) | G1 ) dP = ∫_A E( X | G2 ) dP = ∫_A X dP.
In the special case that G1 = { ∅, Ω } and G2 = σ( Y ), the smoothing law reduces to the statement

E( E( X | Y ) ) = E( X ).
Notation without indices
When using the expectation operator E, adding indices to the operator may lead to cumbersome notation, and these indices are often omitted. In the case of iterated expectations, E( E( X | Y ) ) stands for E_Y ( E_{X|Y} ( X | Y ) ). The innermost expectation is the conditional expectation of X given Y, and the outermost expectation is taken with respect to the conditioning variable Y. This convention is notably used in the rest of this article.
Iterated expectations with nested conditioning sets
The following formulation of the law of iterated expectations plays an important role in many economic and finance models:
E( X | I1 ) = E( E( X | I2 ) | I1 ),

where the information set I1 is contained in the finer information set I2 (so the value of I1 is determined by that of I2). To build intuition, imagine an investor who forecasts a random stock price X based on the limited information set I1. The law of iterated expectations says that the investor can never gain a more precise forecast of X by conditioning on more specific information (I2), if the more specific forecast must itself be forecast with the original information (I1).
This formulation is often applied in a time series context, where E_t denotes expectations conditional on only the information observed up to and including time period t. In typical models the information set at time t + 1 contains all information available through time t, plus additional information revealed at time t + 1. One can then write:[2]

E_t( X ) = E_t( E_{t+1}( X ) ).
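For a hands-on check of the time-series form, the sketch below enumerates a toy setting (three fair coin flips, an illustrative assumption): X is the total number of heads, E_1 conditions on the first flip, and E_2 on the first two flips. Forecasting the finer forecast with only the coarser information reproduces E_1( X ):

```python
from itertools import product

# All 8 equally likely outcomes of three fair coin flips (1 = heads).
outcomes = list(product([0, 1], repeat=3))

def cond_mean(prefix):
    """E( X | first flips equal prefix ), where X = total heads."""
    matching = [sum(o) for o in outcomes if o[: len(prefix)] == prefix]
    return sum(matching) / len(matching)

for first in (0, 1):
    e1 = cond_mean((first,))  # E_1( X ), given the first flip
    # E_1( E_2( X ) ): average the finer forecast over the (fair) second flip.
    e1_of_e2 = sum(cond_mean((first, s)) for s in (0, 1)) / 2
    print(first, e1, e1_of_e2)  # the two forecasts coincide
```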
See also
- The fundamental theorem of poker for one practical application.
- Law of total probability
- Law of total variance
- Law of total covariance
- Product distribution#Expectation (application of the law for proving that the expectation of a product of independent random variables is the product of their expectations)
References
- ↑ Weiss, Neil A. (2005). A Course in Probability. Boston: Addison–Wesley. pp. 380–383. ISBN 0-321-18954-X.
- ↑ Ljungqvist, Lars; Sargent, Thomas J. (2004). Recursive Macroeconomic Theory. Cambridge: MIT Press. pp. 401–402. ISBN 0-262-12274-X.
- Billingsley, Patrick (1995). Probability and measure. New York: John Wiley & Sons. ISBN 0-471-00710-2. (Theorem 34.4)
- Christopher Sims, "Notes on Random Variables, Expectations, Probability Densities, and Martingales", especially equations (16) through (18)