Quick example: I like two jobs equally; one pays $1 a month and the other $1.20 a month. Which should I take (assuming they're the same in every other way)? OK, that was too easy.

How about this: I could pay N100 to enter a lottery that pays out N1000 to one winner. Should I do it?

In maths, when things are uncertain, you think in terms of "expected values." In this example, if you're spending N100, what do you EXPECT to receive? Mathematically, you multiply the amount by the PROBABILITY of receiving it to get the EXPECTED AMOUNT.

Answer: If I have a 50-50 chance of winning (like if only two people entered the lottery) then yes, go for it! On the other hand, if I have a 1 in 100 chance of winning, then I should just keep my money and wait for a better opportunity.

Why? In the first case, the expected payoff is big: you expect to get N1000 * 50% = N500 for spending only N100,

but

in the second case, the expected payoff is only N1000 * 1/100 = N10, which is less than the N100 you spent, so on average you lose money.
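The two cases above can be sketched in a few lines of Python (the `expected_value` helper is just for illustration, not part of any library):

```python
def expected_value(payout, win_probability):
    """Expected amount received: payout times the probability of winning it."""
    return payout * win_probability

ticket_price = 100  # N100 to enter the lottery

# Case 1: 50-50 chance (only two people entered)
ev_fair = expected_value(1000, 1 / 2)    # N500

# Case 2: 1-in-100 chance
ev_long = expected_value(1000, 1 / 100)  # N10

print(ev_fair - ticket_price)  # expected net gain:  400.0
print(ev_long - ticket_price)  # expected net gain: -90.0
```

A positive expected net gain (N400 in the first case) says "play"; a negative one (minus N90 in the second) says "keep your money."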

The gray area: sometimes a positive expected payoff may not be worth it (but that's a question of psychology or other factors, which you can learn to include in your calculations as well). If you think about it, this DECISION THEORY stuff can help you make many different kinds of decisions rationally.

This is the **JULY 2011 blog post**; I'm still playing catch-up.