See definition at PrincipiaCybernetica.
Here is a fairly simple example of analyzing a decision under uncertainty. Say you are given the opportunity to play a street game. The cost of playing is $5. You draw a card from a deck and the payoff depends on what card you draw. If you draw a black card you get nothing. If you draw a heart, you get $10 (double your money). If you draw a diamond you get $20 (quadruple your money). The decision is whether or not to play.
To apply decision theory to the question of whether or not to play, you calculate the expected payoff of each alternative. The expected payoff of playing the game is the sum, over the possible outcomes, of each outcome's payoff multiplied by its probability. There are 3 possible outcomes of playing the game:

* black card (probability 26/52 = 1/2): 1/2 x $0 = $0.00
* heart (probability 13/52 = 1/4): 1/4 x $10 = $2.50
* diamond (probability 13/52 = 1/4): 1/4 x $20 = $5.00
The expected payoff of the game is the sum of the calculations above, minus the $5 cost of playing: $0.00 + $2.50 + $5.00 - $5.00 = $2.50.
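The expected-payoff arithmetic above can be sketched in a few lines of Python. This is a minimal illustration, not a general decision-theory library; the probabilities come from a standard 52-card deck, and the $5 stake matches the subtraction in the calculation above.

```python
from fractions import Fraction

# Each outcome of a draw: (probability, payoff in dollars).
# A standard deck has 26 black cards, 13 hearts, 13 diamonds.
outcomes = [
    (Fraction(26, 52), 0),   # black card: no payoff
    (Fraction(13, 52), 10),  # heart: $10
    (Fraction(13, 52), 20),  # diamond: $20
]
cost = 5  # the stake, per the arithmetic above

# Expected payoff = sum of (probability * payoff) minus the cost of playing.
expected_payoff = sum(p * payoff for p, payoff in outcomes) - cost
print(float(expected_payoff))  # 2.5
```

Using exact fractions avoids floating-point rounding in the intermediate sums; the probabilities must sum to 1 for the calculation to be meaningful.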
Now compare that to the expected payoff of not playing the game ($0: no risk, no reward). The expected payoff of playing ($2.50) is higher, so you should play (assuming, of course, that you want to maximize your expected payoff).
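A quick way to sanity-check an expected-value argument like this is simulation. The sketch below (an assumption-laden illustration, with the same $5 stake and $10/$20 payoffs as above) plays the game many times with a seeded random number generator and averages the net result, which should land near the analytic $2.50.

```python
import random

PAYOFFS = {"spades": 0, "clubs": 0, "hearts": 10, "diamonds": 20}
COST = 5  # stake per game, as above

def play_once(rng):
    # Drawing a card is modeled as picking a suit uniformly at random,
    # since all four suits are equally likely in a full deck.
    suit = rng.choice(list(PAYOFFS))
    return PAYOFFS[suit] - COST  # net winnings for one game

rng = random.Random(0)  # fixed seed for reproducibility
trials = 100_000
average_net = sum(play_once(rng) for _ in range(trials)) / trials
print(round(average_net, 2))  # close to the analytic $2.50
```

The simulated average converges on the expected payoff as the number of trials grows (the law of large numbers), which is exactly why the expected value is the right quantity to compare against the $0 of not playing.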
See other InterestingMemes.