Do you think the probability of a split jackpot should be factored into the calculation of the expected value of lottery tickets? If so, what is that probability?

rdw4potus

I do indeed think that is a factor that should be considered, albeit a somewhat minor one, in the decision to purchase a lottery ticket. To answer your question, I used the jackpot amount and sales figures found at lottoreport.com. I looked at the Powerball going back to Jan. 2008, because that is as far back as that website has data. I also looked at Mega Millions going back to June 2005, a point at which there was a change in the rules. The following table summarizes my results.

### Split Jackpots in Powerball and Mega Millions

| Item | Powerball | Mega Millions |
|---|---|---|
| Probability of winning jackpot | 1 in 195,249,054 | 1 in 175,711,536 |
| Average jackpot offered | \$73,569,853 | \$65,792,976 |
| Average sales per draw | \$23,051,548 | \$25,933,833 |
| Average expected winners per draw | 0.118 | 0.148 |
| Average probability of a split jackpot per draw | 0.74% | 1.29% |
| Loss in return due to shared jackpots (unadjusted) | 4.01% | 6.59% |
| Loss in return due to shared jackpots (adjusted) | 1.41% | 2.31% |

So the average probability that a jackpot will be split is 0.74% in Powerball and 1.29% in Mega Millions. As the jackpot grows and sales go up, so does the probability of a split. The split probability is higher in Mega Millions because the odds of winning are better and sales per draw are higher, so there is more competition from other players.

All things considered, I show 4.01% is lost in return due to jackpot sharing in Powerball and 6.59% in Mega Millions. However, those figures don’t account for taxes, or for the fact that jackpots are paid in the form of an annuity. To adjust for that, I assumed the player keeps only half the advertised jackpot, either by taking the lump-sum option or through the loss in value from choosing the annuity, and that 30% of the remainder is lost to taxes, so the winner can expect to keep 35% after both factors. After that adjustment, I show a 1.41% loss in return due to jackpot sharing in Powerball and 2.31% in Mega Millions.
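
The split-jackpot probabilities in the table can be approximated by treating the number of winning tickets in a draw as Poisson-distributed, with mean equal to tickets sold divided by the jackpot odds. Here is a minimal Python sketch using the average Powerball figures above (and assuming \$1 per play, so dollar sales equal tickets sold). The table’s 0.74% averages this calculation over each individual draw, which runs a bit higher than plugging in average sales, since the split probability rises faster than sales do.

```python
import math

def split_probability(tickets, odds):
    """P(two or more jackpot winners) in one draw, assuming the
    number of winners is Poisson with mean tickets/odds."""
    lam = tickets / odds                 # expected winners per draw
    p0 = math.exp(-lam)                  # P(no winner)
    p1 = lam * math.exp(-lam)            # P(exactly one winner)
    return 1.0 - p0 - p1

# Average Powerball figures quoted above
lam_pb = 23_051_548 / 195_249_054        # about 0.118 expected winners
print(f"{split_probability(23_051_548, 195_249_054) * 100:.2f}%")  # about 0.64%
```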

This question was raised and discussed in the forum of my companion site Wizard of Vegas.

What is the probability of a tie hand in pai gow poker in the front, back, and both at the same time?

Cary L.

Based on a simulation of 7.7 billion hands, assuming the player follows the house way, the probability of a tie in the front (low) hand is 2.55%, or 1 in 39.24. The probability of a tie in the back (high) hand is 0.038%, or 1 in 2,637. The probability of a double tie is about 1 in 78,200.

The Rule of 72 states that you divide an annual rate of return into 72, and that gives you the number of years it will take to double your money. For instance, an investment that pays 10% annually will take 72/10=7.2 years to double in value. My somewhat idle question is, why 72?

mkl654321

First, the "rule of 72" is an approximation of the time needed to double your money, not an exact answer. The following table shows the "rule of 72" values and the exact number of years, for various annual interest rates.

### Rule of 72 — Years to Double Money

| Interest Rate | Rule of 72 | Exact | Difference |
|---|---|---|---|
| 1% | 72.00 | 69.66 | 2.34 |
| 2% | 36.00 | 35.00 | 1.00 |
| 3% | 24.00 | 23.45 | 0.55 |
| 4% | 18.00 | 17.67 | 0.33 |
| 5% | 14.40 | 14.21 | 0.19 |
| 6% | 12.00 | 11.90 | 0.10 |
| 7% | 10.29 | 10.24 | 0.04 |
| 8% | 9.00 | 9.01 | -0.01 |
| 9% | 8.00 | 8.04 | -0.04 |
| 10% | 7.20 | 7.27 | -0.07 |
| 11% | 6.55 | 6.64 | -0.10 |
| 12% | 6.00 | 6.12 | -0.12 |
| 13% | 5.54 | 5.67 | -0.13 |
| 14% | 5.14 | 5.29 | -0.15 |
| 15% | 4.80 | 4.96 | -0.16 |
| 16% | 4.50 | 4.67 | -0.17 |
| 17% | 4.24 | 4.41 | -0.18 |
| 18% | 4.00 | 4.19 | -0.19 |
| 19% | 3.79 | 3.98 | -0.20 |
| 20% | 3.60 | 3.80 | -0.20 |
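
The table is easy to reproduce with a short Python sketch, comparing the 72/rate estimate against the exact doubling time ln(2)/ln(1+i):

```python
import math

# Rule-of-72 estimate vs. exact doubling time, for rates of 1% to 20%
print("Rate  Rule of 72   Exact    Diff")
for pct in range(1, 21):
    i = pct / 100
    rule = 72 / pct
    exact = math.log(2) / math.log(1 + i)
    print(f"{pct:>3}%  {rule:10.2f}  {exact:6.2f}  {rule - exact:+6.2f}")
```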

Why 72? It doesn’t have to be exactly 72. That is just the number that works out well for realistic interest rates you’re likely to see on an investment. It works out almost exactly for an interest rate of 7.8469%. There is nothing special about 72, like there is about π or e. Why does any number work? If the interest rate is i, then let’s solve for the number of years (y) it takes to double an investment.

2 = (1+i)^y
ln(2) = ln((1+i)^y)
ln(2) = y × ln(1+i)
y = ln(2)/ln(1+i)

This may not be my best answer ever, but try to follow this logic. Let y = ln(x).
Then dy/dx = 1/x.
1/x =~ 1 for values of x close to 1.
So dy/dx =~ 1 for values of x close to 1.
So the slope of ln(x) is going to be close to 1 for values of x close to 1.
So the slope of ln(1+x) is going to be close to 1 for values of x close to 0. Since ln(1+0) = 0, this means ln(1+i) =~ i for values of i close to 0.
The "rule of 72" is saying that 0.72/i =~ 0.6931/ln(1+i).
We’ve established that i and ln(1+i) are similar for values of i close to 0.
So 1/i and 1/ln(1+i) are similar for values of i close to 0.
Using 72 instead of 69.31 in the numerator adjusts for the remaining difference between i and ln(1+i) at realistic rates around 8%.
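
The 7.8469% figure mentioned above can be confirmed numerically. This sketch solves 0.72/i = ln(2)/ln(1+i), i.e. f(i) = 0.72×ln(1+i) − ln(2)×i = 0, by bisection:

```python
import math

def f(i):
    # Positive below the crossover rate, negative above it
    return 0.72 * math.log(1 + i) - math.log(2) * i

lo, hi = 0.05, 0.10        # f(0.05) > 0 and f(0.10) < 0 bracket the root
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid
print(f"{100 * lo:.4f}%")  # about 7.8469%
```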

I hope that makes some sense. My calculus is rather rusty; it took hours to explain this to myself.

This question was raised and discussed in the forum of my companion site Wizard of Vegas.

A man is presented with two envelopes full of money. One of the envelopes contains twice the amount as the other envelope. Once the man has chosen his envelope, opened and counted it, he is given the option of changing it for the other envelope. The question is, is there any gain to the man in changing the envelope?

It would appear that by switching the man would have a 50% chance of doubling his money should the initial envelope be the lesser amount and a 50% chance of halving it if the initial envelope is the higher amount. Thus, let x be the amount contained in the initial envelope and y be the value of changing it:

y = 0.5×(x/2) + 0.5×(2x) = 1.25x

Let’s say that the initial envelope contained \$100. So there should be a 50% chance that the other envelope contains 2 × \$100 = \$200 and a 50% chance that the other envelope contains (1/2) × \$100 = \$50. In such a case, the value of the envelope is:

0.5×(\$100/2) + 0.5×(2×\$100) = \$125

This implies that the man would, on average, increase his wealth by 25% simply by switching envelopes! How can this be?

DorothyGale

This appears to be a mathematical paradox, but is really just an abuse of the expected value formula. As you noted in the question, it seems like the other envelope should have 25% more than the one you chose. However, if you buy that, then you may as well pick the other envelope to begin with. Furthermore, you could use that argument to switch back and forth forever if you don’t get to open the envelopes before deciding to switch. Clearly there must be some flaw in the expected value argument. The question is, where is the flaw?

I have spent a lot of time reading about and discussing this problem over the years. I’ve heard and read many explanations of why the y = 0.5×(x/2) + 0.5×(2x) = 1.25x argument is wrong. Some run to pages of advanced mathematics, which I don’t think is necessary. It is a simple question that calls out for a simple answer. So here is my crack at it.

You must be very careful with what you do with the stated fact that one envelope has twice the money as the other one. Let’s call the amount in the smaller envelope S, and the larger one L. So we have:

L=2×S
S=0.5×L

Notice how the 2 and 0.5 factors are applied to different envelopes. You can’t take both factors and apply them to the same amount. If the first envelope has \$100, then if it was the smaller envelope, the other one will have \$200; if the \$100 was the larger envelope, the other one will have \$50. So the other envelope will have \$50 or \$200. However, you can’t jump from there to saying there is a 50/50 chance of each, because that would be applying the 0.5 and 2 factors to the same amount. Without knowing the prize distribution to begin with, you can’t assign probabilities to the possible amounts in the second envelope.

If the 0.5x/2x argument is wrong, then what is the correct way to set up the expected value of the other envelope? The way I would do it is to note that the difference between the two envelopes is L-S = 2S-S = S. By switching you’ll either gain or lose S, whatever it is. If the two envelopes have \$50 and \$100, then switching will gain or lose \$50. If the two envelopes have \$100 and \$200, then switching will gain or lose \$100. Either way, the expected gain from switching is 0. I could also say that if the first envelope has \$100, then there is a 50% chance the difference between the envelopes is \$50 and a 50% chance it is \$100, for an expected difference of \$75. Thus, the expected value of the other envelope is 0.5×(\$100+\$75) + 0.5×(\$100-\$75) = 0.5×\$175 + 0.5×\$25 = \$100.
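
A quick simulation makes the zero expected gain concrete. The sketch below assumes one specific prize distribution (the envelope pair is either {\$50, \$100} or {\$100, \$200}, equally likely), which the paradox argument never pins down:

```python
import random

random.seed(1)

def play(switch):
    """Play one round; return the amount the player ends up with."""
    small = random.choice([50, 100])     # assumed prize distribution
    envelopes = [small, 2 * small]
    random.shuffle(envelopes)
    chosen, other = envelopes
    return other if switch else chosen

n = 100_000
keep = sum(play(False) for _ in range(n)) / n
swap = sum(play(True) for _ in range(n)) / n
print(f"keep={keep:.2f} swap={swap:.2f}")  # both near $112.50
```

Note that with a concrete distribution, the 50/50 doubling-or-halving assumption only holds when the player sees \$100; a \$50 envelope is always the smaller one and a \$200 envelope is always the larger one, which is exactly why always switching gains nothing overall.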

I hope that makes some sense. This problem always induces lots of comments. If you have one, please don’t write to me directly, but post it in my Wizard of Vegas forum. The link is below.

This question was raised and discussed in the forum of my companion site Wizard of Vegas.