TV Game Shows - FAQ


It's not exactly gambling, but I have always wondered what the best strategy is on The Price is Right when spinning the big wheel and you are not last to spin. Assume you can't control your spin (the outcome is completely random), the wheel has 5-cent increments from $0.05 to $1.00, you get one spin or two spins added together, and you can't go over $1.00. At what amount should you decline your second spin to have the best chance of beating the player who spins after you?

Tony C.

The first player should spin again if his first spin is 65 cents or less.

If any of the following conditions is true, the second player should spin again.

  1. His score is less than the first player’s score.
  2. His score is 50 cents or less.
  3. His score is 65 cents or less and he has tied the first player.
The third player should spin again if his score is less than the current highest score. If his first spin ties the highest score, he should spin again if the tie is at 45 cents or less. If the tie is at 50 cents, he has an equal chance of winning by spinning again or standing. If there is a three-way tie after the first spin of the third player, he should spin again if the tie is at 65 cents or less.
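For readers who want to check this sort of thing themselves, here is a rough Monte Carlo sketch in Python of the first player's decision. It assumes the second and third players follow the rules above, settles ties with a one-spin spin-off, treats the 50-cent two-way tie for the third player as standing, and ignores the $1.00 bonus spins, so treat the numbers as approximate.

```python
import random

VALUES = list(range(5, 105, 5))   # wheel values in cents: 5, 10, ..., 100

def spin():
    return random.choice(VALUES)

def spin_off(players):
    """Settle a tie: each tied player takes one spin; repeat until there is a unique winner."""
    while True:
        results = {p: spin() for p in players}
        best = max(results.values())
        players = [p for p in players if results[p] == best]
        if len(players) == 1:
            return players[0]

def play(first_spin, stand):
    """Return True if player 1 wins, given his first spin (in cents) and whether he stands on it."""
    p1 = first_spin if stand else first_spin + spin()
    if p1 > 100:
        p1 = 0                                   # player 1 goes over and is out

    s2 = spin()                                  # player 2 follows the rules listed above
    if s2 < p1 or s2 <= 50 or (s2 == p1 and s2 <= 65):
        s2 += spin()
    p2 = s2 if s2 <= 100 else 0

    best = max(p1, p2)
    s3 = spin()                                  # player 3 follows the rules listed above
    three_way_tie = (s3 == p1 == p2)
    # at a 50-cent two-way tie player 3 is indifferent; standing is assumed here
    if s3 < best or (s3 == best and s3 <= (65 if three_way_tie else 45)):
        s3 += spin()
    p3 = s3 if s3 <= 100 else 0

    scores = {1: p1, 2: p2, 3: p3}
    top = max(scores.values())
    leaders = [p for p, s in scores.items() if s == top]
    winner = leaders[0] if len(leaders) == 1 else spin_off(leaders)
    return winner == 1

def win_rate(first_spin, stand, trials=200_000):
    return sum(play(first_spin, stand) for _ in range(trials)) / trials

for cents in range(55, 80, 5):                   # around the claimed 65-cent threshold
    print(cents, round(win_rate(cents, True), 3), round(win_rate(cents, False), 3))
```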

What is the optimal strategy for the Plinko game on the Price is Right?

Anonymous

From left to right the prizes are $100, $500, $1,000, $0, $10,000, $0, $1,000, $500, $100. I would need to know the exact configuration of pegs on the board to do a perfect analysis, but just eyeballing the board (see link above), I strongly feel the player should drop the puck directly over the $10,000 prize. Although it is bordered by two zeros, all the other prizes pale in comparison to the top prize, so the player should maximize the probability of hitting the top prize by dropping the puck directly above it. To confirm or deny my hypothesis I did a search, and there are lots of links devoted to the study of this game. This one (www.amstat.org/publications/jse/v9n3/biesterfeld.html) is among the better ones, and it agrees with my conclusion. It states in part that the expected value of dropping the puck in the middle is $2,557.91, on either slot next to the middle is $2,265.92, with the expected value tapering off further as you move away from the center.
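For what it is worth, here is a simple random-walk sketch in Python. The board geometry is an assumption (12 rows of 50/50 bounces with forced bounces off the side walls), not the actual Price is Right board, so the exact figures will differ from the paper above, but the qualitative conclusion (drop it in the middle) should come out the same.

```python
import random

PRIZES = [100, 500, 1000, 0, 10000, 0, 1000, 500, 100]   # the nine slots, left to right
ROWS = 12                                                  # assumed number of peg rows (even)
RIGHT_WALL = 2 * (len(PRIZES) - 1)

def drop(slot):
    """One chip dropped over the given slot (0-8); position tracked in half-slot units."""
    pos = 2 * slot
    for _ in range(ROWS):
        if pos == 0:
            pos = 1                # bounce off the left wall
        elif pos == RIGHT_WALL:
            pos -= 1               # bounce off the right wall
        else:
            pos += random.choice((-1, 1))
    return PRIZES[pos // 2]        # with an even number of rows the chip ends on a slot

def expected_value(slot, trials=200_000):
    return sum(drop(slot) for _ in range(trials)) / trials

for slot in range(len(PRIZES)):
    print(f"slot {slot} (${PRIZES[slot]}): EV ≈ ${expected_value(slot):,.0f}")
```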

On the game show Let's Make a Deal there are three doors. For the sake of example, let's say that two doors reveal a goat and one reveals a new car. The host, Monty Hall, picks two contestants, and each picks a door. Every time, Monty first opens a door that reveals a goat. Let's say this time it belonged to the first contestant. Although Monty never actually did this, what if Monty offered the other contestant a chance to switch at this point, to the other unopened door? Should he switch?

Anonymous

Yes! The key to this problem is that the host is predestined to open a door with a goat. He knows which door has the car, so regardless of which doors the players pick, he can always reveal a goat first. The question is known as the "Monty Hall Paradox." Much of the confusion about it arises because when the question is framed, it is often not made clear that the host knows where the car is and always reveals a goat first. I put some of the blame on Marilyn vos Savant, who framed the question badly in her column. Let's assume that the prize is behind door 1. Following is what would happen if the player (the second contestant) had a strategy of not switching.

  • Player picks door 1 --> player wins
  • Player picks door 2 --> player loses
  • Player picks door 3 --> player loses

Following is what would happen if the player had a strategy of switching.

  • Player picks door 1 --> Host reveals goat behind door 2 or 3 --> player switches to other door --> player loses
  • Player picks door 2 --> Host reveals goat behind door 3 --> player switches to door 1 --> player wins
  • Player picks door 3 --> Host reveals goat behind door 2 --> player switches to door 1 --> player wins

So, by not switching, the player has a 1/3 chance of winning. By switching, the player has a 2/3 chance of winning. So the player should definitely switch.
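If you prefer simulation to enumeration, here is a quick Monte Carlo sketch in Python of the same experiment (when the player has picked the car, the host is assumed to open one of the two goat doors at random, which does not affect the result).

```python
import random

def monty_hall(switch, trials=1_000_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # the host opens a door that is neither the player's pick nor the car
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", monty_hall(False))   # about 1/3
print("switch:", monty_hall(True))    # about 2/3
```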

For further reading on the Monty Hall paradox, I recommend the article at Wikipedia.

My question is about a problem that is known as the "two envelope paradox". You are on a game show. In front of you are 2 envelopes, each containing an unknown amount of cash. You are told that 1 envelope has twice as much money as the other. You are now asked to choose an envelope. You choose one. It contains $50,000. Now you are told that you can keep the envelope you picked, or swap for the other one. Should you swap? Knowing ahead of time that you could swap, then it doesn’t matter, as you would just choose the envelope you ultimately want. But because you only find out about swapping after you choose an envelope, then the original selection and the option to swap are 2 independent events, correct? That said, when deciding to swap or not, the other envelope contains either twice as much or half as much as what you currently have. So it has either $100K or $25k. Since there is a 50% chance of either occurring, the Expected Value of the other envelope is $62,500. Generically speaking, if we let x = the amount you originally selected, then the other envelope's EV is 1.25x. Therefore it is always correct to swap. Is this correct? Thank you.

Derek from Boston

I'm very familiar with this problem. I address the general case, including not looking in the first envelope at all, in problem 6 on my site of math problems. However, to answer your question we cannot ignore the venue where the game is taking place. You said it was a "game show." On most game shows $50,000 is a nice win. Few contestants on The Price is Right ever make it that high. I would guess that fewer than 50% of players on Who Wants to be a Millionaire get that high. Meanwhile, wins of $25,000 are not unusual on game shows. Cars are won routinely on The Price is Right, with values of about $25,000. The $32,000 level is a common win on Who Wants to be a Millionaire. The average win on Jeopardy per show is roughly $25,000; the great Ken Jennings averaged only $34,091 over his 74 wins. So, my point is that $50,000 is a nice win for a game show, and $100,000 wins are seen much less often than $25,000 wins. Thus, as a game show connoisseur, it is my opinion that the other envelope is more likely to have $25,000 than $100,000, so in your example it is better to keep the $50,000. It also goes to show you can never assume the chances that the other envelope has half as much or twice as much are exactly 50/50. Once you see the amount and put it in the context of the venue where the game is being played, you can make an intelligent decision about switching, which throws the 1.25x argument out the window.
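To put a number on that last point, here is a tiny sketch in Python of the expected value of switching under a prior other than 50/50. The 2/3 break-even point is my own illustrative figure, and it assumes the other envelope can only hold $25,000 or $100,000, as in the example.

```python
def ev_of_switching(p_half, keep=50_000):
    """Expected value of the other envelope if it holds half the amount with probability p_half."""
    return p_half * keep / 2 + (1 - p_half) * keep * 2

for p in (0.5, 2 / 3, 0.8):
    print(f"P(other = $25,000) = {p:.2f}: EV of switching = ${ev_of_switching(p):,.0f}")
# Switching only looks attractive while P(other = $25,000) is below 2/3;
# the naive 1.25x argument corresponds to assuming it is exactly 0.5.
```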

My question is regarding the game show, Deal or No Deal, very popular in Australia and about to come to England. The contestant is shown twenty-six numbered briefcases, each containing a hidden amount of money, ranging from 50 cents to $200,000 as below.

  1. $0.50
  2. $1
  3. $2
  4. $5
  5. $10
  6. $25
  7. $50
  8. $75
  9. $100
  10. $150
  11. $250
  12. $500
  13. $750
  14. $1,000
  15. $1,500
  16. $2,000
  17. $3,000
  18. $5,000
  19. $7,500
  20. $10,000
  21. $15,000
  22. $30,000
  23. $50,000
  24. $75,000
  25. $100,000
  26. $200,000

The contestant selects one of the briefcases to be THEIR briefcase. Through a process of elimination, by opening the other briefcases, they try to work out how much money is in their case, or whether it would be wiser to take a "Bank Offer." The Bank Offers are based on, but not equivalent to, the arithmetic mean of the remaining briefcases. So, if mainly large-valued briefcases remain, there is a high chance that the contestant's briefcase is valuable, and the Bank Offer will be generous. Conversely, if the player has been less fortunate and opened the more valuable briefcases, then the Bank Offer will be low. What would be the best strategy to employ if you were a contestant on this game? A non-mathematical, gut-instinct strategy would be to ignore the Bank Offers and carry on opening cases until either the $200,000 was opened and eliminated, or both the $100,000 and $75,000 were opened and eliminated. What's the math behind this game, Wizard?

Jacqui from Birmingham, England

Deal or No Deal just started here in the U.S. The rules sound the same, except our prizes go up to a million dollars, as follows.

  1. $0.01
  2. $1
  3. $5
  4. $10
  5. $25
  6. $50
  7. $75
  8. $100
  9. $200
  10. $300
  11. $400
  12. $500
  13. $750
  14. $1,000
  15. $5,000
  16. $10,000
  17. $25,000
  18. $50,000
  19. $75,000
  20. $100,000
  21. $200,000
  22. $300,000
  23. $400,000
  24. $500,000
  25. $750,000
  26. $1,000,000

Here is the flow of the game:

  1. The player picks one case for himself.
  2. The player opens six of the remaining 25 cases.
  3. The banker makes an offer.
  4. If the player declines, he opens five more of the 19 remaining cases.
  5. The banker makes an offer.
  6. If the player declines, he opens four more of the 14 remaining cases.
  7. The banker makes an offer.
  8. If the player declines, he opens three more of the 10 remaining cases.
  9. The banker makes an offer.
  10. If the player declines, he opens two more of the 7 remaining cases.
  11. The banker makes an offer.
  12. If the player declines, he opens one more of the remaining cases.
  13. Steps 11 and 12 repeat until the player accepts an offer or is left with the last unopened case.

I have watched three games in their entirety. The following table shows the remaining prizes, average amount, and banker offer of what I will call "game 1."

Deal or No Deal Game 1

Initial Stage 1 Stage 2 Stage 3 Stage 4 Stage 5 Stage 6 Stage 7
$0.01 $0.01 $0.01
$1 $1 $1
$5 $5 $5 $5 $5
$10 $10 $10 $10 $10 $10 $10 $10
$25 $25 $25 $25
$50 $50
$75 $75 $75 $75 $75 $75 $75 $75
$100 $100 $100
$200 $200 $200 $200 $200
$300
$400 $400 $400 $400
$500 $500 $500 $500 $500 $500 $500
$750
$1000 $1000
$5000 $5000 $5000 $5000 $5000 $5000 $5000 $5000
$10000
$25000 $25000
$50000
$75000 $75000
$100000
$200000
$300000 $300000 $300000 $300000
$400000 $400000 $400000
$500000 $500000 $500000 $500000 $500000 $500000
$750000 $750000
$1000000 $1000000 $1000000 $1000000 $1000000 $1000000 $1000000 $1000000
Expected Value $152868 $147088 $164201 $188224 $250931 $201117 $251271
Offer $13000 $27000 $48000 $76000 $124000 $121000 $201000

The following chart plots the player’s expected value and the banker’s offer.

Next are the same table and charts for games 2 and 3.

Deal or No Deal Game 2

Initial Stage 1 Stage 2 Stage 3 Stage 4 Stage 5 Stage 6 Stage 7 Stage 8 Stage 9
$0.01
$1 $1
$5 $5 $5 $5 $5 $5 $5 $5 $5
$10 $10
$25
$50 $50 $50 $50 $50 $50 $50 $50 $50 $50
$75 $75 $75
$100 $100 $100 $100 $100
$200 $200 $200 $200 $200 $200 $200 $200
$300 $300 $300 $300
$400
$500
$750 $750
$1000 $1000 $1000
$5000 $5000 $5000 $5000 $5000 $5000 $5000
$10000 $10000 $10000 $10000 $10000 $10000
$25000 $25000 $25000
$50000 $50000 $50000 $50000
$75000 $75000 $75000
$100000 $100000
$200000 $200000 $200000 $200000 $200000 $200000 $200000 $200000 $200000 $200000
$300000
$400000 $400000 $400000 $400000
$500000 $500000 $500000 $500000 $500000
$750000
$1000000 $1000000
Expected Value $118375 $84449 $105969 $89419 $35876 $41051 $50064 $66685 $100025
Offer $11000 $19000 $37000 $32000 $25000 $37000 $50000 $67000 $99000

Deal or No Deal Game 3

Initial Stage 1 Stage 2 Stage 3 Stage 4 Stage 5 Stage 6 Stage 7 Stage 8
$0.01
$1 $1
$5 $5 $5
$10 $10 $10
$25 $25 $25 $25 $25 $25
$50 $50 $50 $50 $50 $50 $50 $50 $50
$75 $75 $75 $75
$100 $100 $100 $100 $100
$200 $200 $200 $200 $200 $200 $200 $200 $200
$300 $300 $300 $300 $300 $300 $300
$400 $400
$500
$750
$1000 $1000 $1000 $1000
$5000 $5000 $5000
$10000
$25000 $25000 $25000 $25000 $25000
$50000 $50000
$75000
$100000 $100000
$200000 $200000
$300000
$400000 $400000 $400000
$500000 $500000 $500000 $500000 $500000 $500000 $500000 $500000
$750000 $750000 $750000 $750000
$1000000 $1000000 $1000000 $1000000 $1000000 $1000000 $1000000 $1000000 $1000000
Expected Value $151608 $178784 $206977 $190709 $250096 $300110 $375063 $333417
Offer $9000 $28000 $58000 $75000 $113000 $199000 $275000 $267000

The most obvious thing to be learned from these three games is that the first four to six bank offers are terrible deals. The average suitcase holds $131,477.54 before any are opened. To offer only $9,000 to $13,000 at the first stage is a deal only a fool would take. However, the offers gradually get better. Game 2 shows the banker offers were almost the same as the expected values toward the end of the game, when the player's expected value was fairly low. However, in games 1 and 3, when the expected values were higher, the banker apparently was trying to take advantage of the risk-averse nature of most people when large amounts are involved. I don't know if it mattered, but the contestant in game 2 appeared to be a gambler who wanted to win big. Based on comments by the host, who communicates with the banker by phone, the banker does appear to take the contestant's words and actions into consideration. If I were in the banker's shoes I would act much the same.

If the player is neither risk-averse nor risk-prone, and ignoring tax implications, he should keep refusing banker offers until one exceeds the average of the remaining suitcases. For most people, the progressive nature of the income tax code favors taking a deal. As I have said before, I would roughly say the value of money is proportional to the log of the amount, so the more wealth you have going into the game, the more inclined you should be to gamble and refuse the banker offers. With such large amounts involved, no single strategy will fit everybody. However, I can fairly confidently say that the player should refuse the first four to six offers and then take the offers on a case-by-case basis (pun intended).
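The arithmetic behind that strategy is nothing more than averaging the remaining cases and comparing the offer to that average. Here is a short sketch in Python; the $131,477.54 figure and the first-round offers are taken from the games above.

```python
US_PRIZES = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750, 1000,
             5000, 10000, 25000, 50000, 75000, 100000, 200000, 300000,
             400000, 500000, 750000, 1000000]

def case_average(remaining):
    """The player's expected value: the average of the unopened amounts."""
    return sum(remaining) / len(remaining)

print(f"${case_average(US_PRIZES):,.2f}")   # $131,477.54 before any case is opened

# Ignoring risk aversion and taxes, an offer is worth taking only if it
# exceeds this average of the remaining cases.
first_offers = {"game 1": 13000, "game 2": 11000, "game 3": 9000}
for game, offer in first_offers.items():
    print(game, offer, offer > case_average(US_PRIZES))
```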

Links:
You can play Deal or No Deal at NBC.com.
Archive of past shows.

Watching "Deal or no Deal". I realize the "offer" from the banker is just the remaining values of the cases divided by the number of cases [give or take rounding]. Is there ANY strategy to this game at all, or is "the deal" always just an OK thing to take? Does it depend on how many cases you have to open or anything?

Darren from Elk Grove, CA

As my December 26, 2005 column shows, the banker offer is usually much less than the average of the remaining cases. However, hypothetically, if the offer always equaled that average, then every strategy would have the same expected value, and the player would be indifferent at every offer.

At the start of Deal or No Deal the odds of picking the $1,000,000 case are 1 in 26. After eliminating all the cases except one, what are the odds that my case contains the million dollars? Is it 50-50 or still 1 in 26?

Ken from Chester, NY

50-50

In your April 5, 2006 column you state that if there are only two cases left in Deal or no Deal and the million dollars is still in play then the probability my case has the million is 50-50. I disagree. Isn’t this just a variation of the Monty Hall problem? That is, the million is more likely to be on the stage than in his case?

Jason from Pasadena, CA

No. I'm getting lots of people arguing with me about this one. Many writers claim that probabilities cannot change as additional information is introduced, so if the probability starts at 1 in 26 then it must stay there. Contrary to what betting system salesmen say, probabilities indeed can change as additional information is introduced. I don't want to try to teach basic probability here, but any college-level math book on conditional probability or Bayes' Theorem should cover this topic nicely.

Let me explain what happened on Let's Make a Deal. The contestant would choose one of three curtains. One would contain a very valuable prize and the other two smaller prizes. For the sake of argument, let's say behind one curtain was a car and behind the other two a goat. Then Monty would always, I repeat ALWAYS, open one of the two unchosen curtains to reveal a goat. After hundreds of shows this implies that Monty Hall (the host) knew where the car was and deliberately opened a curtain that revealed a goat. Obviously, when the player chose his curtain, the probability it held the car was 1/3, and the probability one of the two unchosen curtains held the car was 2/3. Monty is then predestined to open an unchosen curtain containing a goat. Predestined is the key word here. Because Monty cannot open the player's curtain at this stage, the probability that the player's curtain hides the car stays at 1/3. The probability an unchosen curtain hides the car remains at 2/3; however, it is now all on one curtain. So after a goat is revealed, the probability the player's curtain has the car is 1/3 and the probability the other unopened curtain has the car is 2/3, making switching a wise choice.

The following table shows all the possible outcomes. In the case where the player chose the curtain with the car, I had Monty opening one of the two goat curtains at random. You can see that not switching results in a 1/3 probability of winning, and switching results in a 2/3 probability of winning.

Let’s Make a Deal

Player Chooses    Car    Curtain Opened    Probability    Win by Switching
1 1 1 0% n/a
1 1 2 5.56% N
1 1 3 5.56% N
1 2 1 0% n/a
1 2 2 0% n/a
1 2 3 11.11% Y
1 3 1 0% n/a
1 3 2 11.11% Y
1 3 3 0% n/a
2 1 1 0% n/a
2 1 2 0% n/a
2 1 3 11.11% Y
2 2 1 5.56% N
2 2 2 0% n/a
2 2 3 5.56% N
2 3 1 11.11% Y
2 3 2 0% n/a
2 3 3 0% n/a
3 1 1 0% n/a
3 1 2 11.11% Y
3 1 3 0% n/a
3 2 1 11.11% Y
3 2 2 0% n/a
3 2 3 0% n/a
3 3 1 5.56% N
3 3 2 5.56% N
3 3 3 0% n/a

Meanwhile, on Deal or No Deal nothing is predestined. Let's assume the amounts remaining were $0.01, $1, and $1,000,000. With three cases left it IS possible that the opened case will contain the million dollars. The following table shows the possible outcomes with three cases left. Remember, the player cannot open his own case.

Deal or No Deal

Player Chooses    Million $    Case Opened    Probability    Win by Switching
1 1 1 0% n/a
1 1 2 5.56% N
1 1 3 5.56% N
1 2 1 0% n/a
1 2 2 5.56% Hopeless
1 2 3 5.56% Y
1 3 1 0% n/a
1 3 2 5.56% Y
1 3 3 5.56% Hopeless
2 1 1 5.56% Hopeless
2 1 2 0% n/a
2 1 3 5.56% Y
2 2 1 5.56% N
2 2 2 0% n/a
2 2 3 5.56% N
2 3 1 5.56% Y
2 3 2 0% n/a
2 3 3 5.56% Hopeless
3 1 1 5.56% Hopeless
3 1 2 5.56% Y
3 1 3 0% n/a
3 2 1 5.56% Y
3 2 2 5.56% Hopeless
3 2 3 0% n/a
3 3 1 5.56% N
3 3 2 5.56% N
3 3 3 0% n/a

What the Deal or No Deal table shows is that with three cases left the probability the opened case contains the million dollars is 1/3 (hopeless to win), the probability a switching player will win is 1/3, and the probability a switching player will lose is 1/3. Thus the odds are the same whether or not the player switches cases. Once there are only two cases left, the probability each case contains the larger prize is 50/50.
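A short simulation makes the contrast with Monty Hall concrete: the cases are opened blindly, so we simply condition on the games in which the million happens to survive to the final two cases. A minimal sketch in Python, assuming 26 cases and random openings:

```python
import random

def million_in_players_case(trials=2_000_000):
    survived = in_players_case = 0
    for _ in range(trials):
        million = random.randrange(26)        # which case holds the million
        players_case = random.randrange(26)
        others = [c for c in range(26) if c != players_case]
        random.shuffle(others)
        opened = others[:24]                  # open all but one of the other cases, blindly
        if million not in opened:             # the million survived to the final two cases
            survived += 1
            in_players_case += (million == players_case)
    return in_players_case / survived

print(million_in_players_case())              # about 0.5
```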

Time for another Deal or No Deal question. Let's say after all the deals from the banker and guest appearances by Celine Dion, you're left with two suitcases, the $500,000 and the $1,000,000. The banker's offer will be slightly less than $750,000, I assume. Which would you choose? What if the two briefcases left were the $0.01 and the $1,000,000? I guess it's all a matter of whether you're a gambler or not, and nothing really to do with odds. The reason I'm asking is I wonder if ANYBODY will ever win the $1,000,000 (even if they've picked the magic briefcase).

Jason from Vancouver

When the prizes become life-changing amounts, the wise player should play conservatively at the expense of maximizing expected value. A good strategy should be to maximize expected happiness. A good function to measure happiness I think is the log of your total wealth. Let’s take a person with existing wealth of $100,000 who is presented with two cases of $0.01 and $1,000,000. By taking “no deal” the expected happiness is 0.5*log($100,000.01) + 0.5*log($1,100,000) = 5.520696. Let b be the bank offer where the player is indifferent to taking it.

log(b) = 5.520696
b = 10^5.520696
b = $331,662.50

So this hypothetical player should be indifferent at a bank offer of $331,662.50. The less wealth you have going into the game, the more conservatively you should play. Usually in the late stages of the game the bank offers are close to expected value, sometimes a little bit more. The only rational case in which a player could win the million is if he had a lot of wealth going into the game and/or the bank offers were unusually stingy. The producers seem to like hard-working middle-class people, so we're unlikely to see somebody who can afford to be cavalier when large amounts are involved. I have also never seen the bank make offers under 90% of expected value late in the game. The time we will see somebody win the million is when a degenerate gambler who can't stop gets on the show. When that happens I will be rooting for the banker.
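Here is the indifference calculation above expressed as a few lines of Python, so you can plug in your own outside wealth and remaining amounts. It simply mirrors the log calculation in the text and assumes the remaining amounts are equally likely.

```python
import math

def indifference_offer(wealth, remaining):
    """The offer b with log(b) equal to the expected log-wealth of taking "no deal",
    mirroring the calculation above."""
    expected_log = sum(math.log10(wealth + amt) for amt in remaining) / len(remaining)
    return 10 ** expected_log

print(f"${indifference_offer(100_000, [0.01, 1_000_000]):,.2f}")   # $331,662.50
```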

In a two-player case, what is the optimal strategy for Final Jeopardy?

Eliot from Santa Barbara

First, for the sake of simplicity, I’m going to assume that for any given question, all players have the same probability of producing the correct answer. Also, questions range in difficulty, and this probability itself is random. Thus, the results among players will be correlated.

I know that the equal probability of answering correctly among players is an unrealistic assumption. For one thing, the leader probably has a higher probability of answering correctly. For another, each player knows the category before wagering. Give me a category of gambling and I'm close to a lock, but if it is poetry, I'm doomed. So don't write to me about how unrealistic my assumptions are, because I acknowledge that already. This is meant to be more of an exercise in game theory than practical advice for future contestants.

Next, we need some data to look at. From j-archive.com we find that in season 22, in 2005, Final Jeopardy was answered correctly by 43.80% of players. We also learn that all three contestants got it right 14.92% of the time, and all three got it wrong 24.86% of the time. Let pn be the probability that exactly n players get the question right. The first equation to work with is:

p0 + p1 + p2 + p3 = 1

Substituting the known values for p0 and p3:

0.2486 + p1 + p2 + 0.1492 = 1.

p1 + p2 = 0.6022.

(1) p1 = 0.6022 - p2.

We can also construct the probability of any given player getting the question right as follows.

p0×0 + p1×(1/3) + p2×(2/3) + p3×1 = 0.4380.

p1×(1/3) + p2×(2/3) + 0.1492 = 0.4380.

p1×(1/3) + p2×(2/3) = 0.2888.

Multiplying both sides by 3:

(2) p1 + p2×2 = 0.8664.

Substituting the value for p1 in equation (1) we get

0.6022 - p2 + p2×2 = 0.8664.

p2 = 0.2642

Solving for p1

p1 = 0.6022 - p2.

p1 = 0.6022 − 0.2642 = 0.3380.

Based on the three-player probabilities above, in a two-player game the probability both get the question right is:

p3 + (1/3)×p2 = 0.1492 + (1/3)×0.2642 = 0.2373.

The probability one gets it right is:

(2/3)×p2 + (2/3)×p1 = (2/3)×(0.2642 + 0.3380) = 0.4015.

The probability neither gets it right is:

(1/3)×p1 + p0 = 0.3380×(1/3) + 0.2486 = 0.3613.
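The algebra above is easy to check by machine. Here is a short sketch in Python that solves for p1 and p2 and then derives the two-player probabilities:

```python
p0, p3 = 0.2486, 0.1492          # all three wrong, all three right
p_correct = 0.4380               # any given player correct

# Solve the two equations above.
s = 1 - p0 - p3                  # p1 + p2
t = 3 * (p_correct - p3)         # p1 + 2*p2
p2 = t - s
p1 = s - p2
print(p1, p2)                    # 0.3380, 0.2642

# Two-player probabilities derived from the three-player ones.
both = p3 + p2 / 3
exactly_one = (2 / 3) * (p2 + p1)
neither = p1 / 3 + p0
print(round(both, 4), round(exactly_one, 4), round(neither, 4))   # 0.2373, 0.4015, 0.3613
```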

It is obvious that if the leader has more than double the amount of the follower, he should bet less than the difference, to ensure victory. However, what should he do if his score is less than double that of the follower? Let's draw up a specific example where player A has $10,000 and player B has $8,000. To make matters easy, let's restrict A's options to making a big bet of $6,001 (enough to guarantee victory with a correct answer) or a small bet of $1,999 (leaving enough to guarantee victory if B answers incorrectly). Let's restrict B's options to betting all or nothing.

At first glance, it would seem the right play for player A is to bet small, forcing player B to get the question right to win. That would give player A a 1 − 43.8% = 56.2% chance of winning. Assuming that strategy, player B would have to bet big to win. However, because of the correlation between players, if player B gets the question right, player A probably would too. The exact probability that player A gets the question right, given that player B got it right, is, by Bayes' Theorem:

Probability(A and B correct)/Probability(B correct) = 0.2373/0.438 = 0.5417.

So, if player A knew player B would bet big, he should too. However, if player B knew player A would bet big, then he should bet small and hope player A gets it wrong, giving himself a 56.2% chance of winning. Going further, if player A knew player B would bet small, he would too, and have a 100% chance of winning. Assuming player A bet small, player B would of course bet big. And so we go around and around.

The optimal strategy for both players is to randomize their bet.

The following table shows the probability player A will win according to all four combinations of bets.

Probability Player A Wins

Player A \ Player B    High    Low
High 0.7993 0.438
Low 0.562 1

The next step is harder to explain, but for either player the optimal probability of each option is proportional to the absolute difference between the two values of the other option. Player A should bet high with probability proportional to abs(0.5620 − 1) = 0.4380, and low with probability proportional to abs(0.799267 − 0.4380) = 0.3613. So, the actual probability of A betting high should be 0.4380/(0.4380 + 0.3613) = 0.548002, and the probability of going low is 1 − 0.548002 = 0.451998.

By the same logic, player B should go high with probability proportional to abs(0.438000 - 1) = 0.562000, and low with probability proportional to abs(0.799267 − 0.562000) = 0.237267. The actual probability of going high should be 0.562000/(0.562000 + 0.237267) = 0.703145. Thus, the probability B should go low is 1 - 0.703145 = 0.296855.

The next table shows the probability of each of the four combinations of strategies.

Strategy Probabilities

Player A \ Player B    High    Low    Total
High 0.385325 0.162677 0.548002
Low 0.31782 0.134178 0.451998
Total 0.703145 0.296855 1

So, if both players follow the optimal strategy, the probability of player A winning is 0.385325×0.799267 + 0.162677×0.438000 + 0.31782×0.562000 + 0.134178×1 = 0.692023. Neither player can improve his chances by deviating unilaterally, and a predictable deviation can be exploited by the other player.
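For the curious, the mixing probabilities come from the standard solution of a 2×2 zero-sum game: each player mixes so that the opponent is indifferent between his two options. Here is a short sketch in Python that reproduces the numbers above (valid because neither player has a dominant pure strategy here):

```python
def solve_2x2(m):
    """
    m[i][j] = probability the row player (A) wins when A plays i and B plays j,
    with i, j in {0: high, 1: low}. Returns (P(A bets high), P(B bets high), game value).
    Assumes the game has no saddle point, which is the case here.
    """
    (a, b), (c, d) = m
    denom = a - b - c + d
    p_a_high = (d - c) / denom        # makes B indifferent between his options
    p_b_high = (d - b) / denom        # makes A indifferent between his options
    value = (a * d - b * c) / denom
    return p_a_high, p_b_high, value

matrix = [[0.799267, 0.438], [0.562, 1.0]]
print(solve_2x2(matrix))              # about (0.548, 0.703, 0.692)
```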

This is a follow up on Deal or No Deal, which I watched for the first time recently. Your analysis assumes that the house doesn’t know the value of the money in the suitcase. However, in the show I watched, in the endgame both contestants had selected a valuable case, and both were offered (or would have been offered, as one had already quit) above expected value (EV) deals. In the most extreme case, a player "would have been" offered $687K when the two dollar amounts left were $500K and $750K. The only rational explanation for this is that the banker knows the value of the player’s suitcase and the deals offered are based on that.

Just my two cents, and no reply is necessary.

J.N.S. from Bellevue, WA

Thanks for not expecting a reply, but I usually do reply to game show questions. They claim in every episode that the amounts in the cases are randomly placed, and that neither Howie nor the banker knows the results. This was never claimed on Let's Make a Deal, where Monty Hall obviously did know. I too have seen the banker offer more than expected value as the last offer, especially when large amounts are involved. In my strong opinion, this is not because the banker knows what is inside the player's case. In the 1950s there was a huge scandal when it became known that the show Twenty-One, as well as others, was fixed. There is no compelling reason to ruin a successful show, and the integrity of all game shows, to skim some prize money via the bank offers.

I can offer three theories why the banker sometimes offers more than the average of the remaining cases.

  1. The show tries to portray the banker as sweating the money in his office. Howie Mandel often comments on the banker's mood and tone of voice. Maybe it makes the show more dramatic to think of the banker as a risk-averse bean counter, preferring to cut his losses rather than risk giving out a big prize.
  2. The real banker truly is risk-averse. This is getting out of my area of expertise, but from my understanding, game and reality shows are usually produced by a company independent from the television network. These smaller companies will seek out an insurance company to mitigate the risk of contestants winning the larger prizes. In such a case, the insurance company would be the real banker, and may be influencing the behavior of the banker on the show. The insurance companies that insure odd-ball stuff like this are not gigantic, and may prefer playing it safe when large amounts are involved.

    In your example, the banker offer was 9.92% above expected value. If the banker were following the Kelly Criterion, such an offer would have been made with a total bankroll of only $782,008, which is less than the maximum prize. No self-respecting insurance company would be that conservative. Clearly, this reason alone cannot justify the offer in your example.

  3. The show is trying to make the contestants look stupid and greedy. Shows like Are You Smarter than a Fifth Grader and the Tonight Show's “Jaywalking” would not be successful if we didn’t find some satisfaction in laughing at the trivia-challenged. The shows Friend or Foe and The Weakest Link were outstanding at exposing greed in human nature. I must confess a sense of schadenfreude when a contestant refuses an above expected value offer, and walks with the lower amount in his case.

I tend to think the reason is a combination of these three reasons, but mainly the third.

If I ended this answer here, I'm sure I would get comments questioning whether the hypothetical banker offers would really have been made, the implication being that they are puffed up for dramatic effect. I have recorded the specifics of 13 games. In one of them, with three cases left ($1,000, $5,000, and $50,000), the average was $18,667 and the offer was $21,000. That is 12.5% over the expected value. In another show, with two cases left ($400 and $750,000), the average was $375,200 and the offer was $400,000. That is 6.6% above expected value. So, I see no reason to question the integrity of the hypothetical offers.

Links:

Deal or no deal formula: This page shows old and new formulas for calculating the banker offer, based on the free game at the Deal or No Deal web site.

How much would you bet, in each person’s shoes, in Final Jeopardy, with these scores:

Player A: $10,000
Player B: $8,000
Player C: $3,500

Eliot from Santa Barbara, CA

Let me start by making some assumptions. First, I’m going to assume that the three players have no prior knowledge of betting behavior in Final Jeopardy, except the probabilities of being correct in the table presented. Second, I’m going to assume that knowing the category is of no help. Third, I’m also going to assume that all three contestants want to go for the win, not wishing to take another player along in a tie.

Let’s start with player C. He should anticipate that A might bet $6001, to stay above B if B is right. However, if A is wrong, that would lower him to $3999. C would need to bet at least $500, and be right, to beat A in such a scenario. However, in my opinion, if you must be right to win, you may as well bet big. So if I were C I would bet everything.

B is torn between betting big or small. A small bet should be $999 or less, to stay above C if C is correct. The benefit of a small bet is staying above C no matter what, hoping that A will go big, and be wrong. A big bet does not necessarily have to go the whole way, but it may as well. The benefit of a big bet is hoping that either A goes small, or goes big and is wrong, but both require B to be right.

A basically wants to go the same way as B. A small bet for A can be anything from $0 to $1000, which will stay above B if B bets $999. A big bet should be $6001, to guarantee a win if A is right, and still retain hope if B goes big, and all three players are wrong.

To help with the probabilities of the eight possible outcomes of right and wrong answers, I looked at the Final Jeopardy results for seasons 20 to 24, from j-archive.com. Here is what the results look like, where player A is the leader, followed by player B, and C in last.

Possible Outcomes in Final Jeopardy

Player A Player B Player C Probability
Right Right Right 21.09%
Right Right Wrong 9.73%
Right Wrong Right 10.27%
Wrong Right Right 8.74%
Right Wrong Wrong 13.33%
Wrong Right Wrong 10.27%
Wrong Wrong Right 8.63%
Wrong Wrong Wrong 17.92%

Using the kind of game theory logic I explain in problem 192 at my site mathproblems.info, I find that A and B should randomize their strategy as follows.

Player A should bet big with probability 73.6% and small with probability 26.4%.
Player B should bet big with probability 67.3% and small with probability 32.7%.
Player C should bet big with probability 100.0%.

If this strategy is followed, the probability of each player’s winning will be as follows:

Player A: 66.48%
Player B: 27.27%
Player C: 6.25%

As an aside, based on the table above, the probability of the leader getting Final Jeopardy correct is 54.4%, for the second-place player, 49.8%, and 48.7% for the third-place player. The overall probability is 51.0%.
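Those per-player percentages follow directly from the outcome table. Here is a quick check in Python, with the probabilities transcribed from the table above:

```python
# outcomes: (A right?, B right?, C right?) -> probability, from the table above
outcomes = {
    (1, 1, 1): 0.2109, (1, 1, 0): 0.0973, (1, 0, 1): 0.1027, (0, 1, 1): 0.0874,
    (1, 0, 0): 0.1333, (0, 1, 0): 0.1027, (0, 0, 1): 0.0863, (0, 0, 0): 0.1792,
}
for i, name in enumerate("ABC"):
    print(name, round(sum(p for key, p in outcomes.items() if key[i]), 4))
# A 0.5442, B 0.4983, C 0.4873
```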

As a practical note, players do have knowledge of betting behavior. In my judgment, players tend to bet big more often than is mathematically justified. Interestingly, I find wagering on the Daily Doubles to be more conservative than is mathematically justified. One of the reasons I believe Ken Jennings did so well was aggressive wagering on the Daily Doubles. Anyway, if I were actually on the show, I would assume the other two players would bet aggressively. So my actual wagers would be $6,000 as A (being nice to B), $0 as B, and $3,495 as C (leaving a little un-bet, in case A foolishly bets everything, or all but $1, and is wrong).

Before somebody challenges me about how one could draw a random number in the actual venue, let me suggest the Stanford Wong strategy of using the second hand of your watch to draw a random number from 1 to 60.

A new game show has premiered in the UK, called the "Colour of Money." A lone contestant is randomly given a target amount, which has been known to range from £55,000 to £79,000. To earn money, he picks 10 of 20 bank machines, each containing £1,000 to £20,000, in even increments of £1,000. When he picks a machine, it will begin counting upwards from £1,000, in £1,000 increments.

The player may yell "Stop!" at any time, and he will bank the amount showing on screen. If the player does not stop in time, and the machine runs out of money, then he banks nothing. A hostess provides statistics, such as number of machines left to pick, amount left to earn, average amount needed per machine to win, and what amounts remain in the machines.

A player can "play the gaps," in that if a run of machines have been picked, say, £4k, £5k, and £6k, a machine would be guaranteed to make it to £7,000 once it passes the £3,000 mark. My question is, what kind of strategy should a player use?

James Key from Louisville, KY

This is the kind of thing I could spend weeks analyzing. Unfortunately, I read your message almost three months after you wrote it, due to a large backlog of "Ask the Wizard" questions. The Wikipedia page seems to indicate that the show was a flop and was canceled. However, it still makes for an interesting problem.

The hostess conveniently tells you the average amount you need per remaining machine to reach your target. After hours of scribbling, I can't come up with anything better than setting a stopping goal of about 25% higher than that required average. That is just an educated guess, so please don't ask me to prove it is optimal. As you noted, also ride the gaps, never stopping just before an amount that has already been picked.

When there are only two machines left, if the total amount needed is £13,000 or less, I would try to get it all in the second-to-last machine. If £14,000 or more, I would try to get half of it at the next machine.
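Here is a rough Monte Carlo sketch in Python of that educated guess. It uses the simplified rules described above (20 machines holding £1,000 to £20,000, ten picks, stop in time or bank nothing), draws the target uniformly from £55,000 to £79,000, and ignores the "play the gaps" refinement, so treat it as a sanity check rather than a proof.

```python
import math
import random

def win_rate(multiplier, trials=100_000):
    wins = 0
    for _ in range(trials):
        target = random.randint(55, 79)              # target, in £1,000 units
        machines = random.sample(range(1, 21), 10)   # hidden values of the ten picks
        banked = 0
        for i, value in enumerate(machines):
            left = 10 - i
            need = max(target - banked, 0)
            if need == 0:
                break
            # stop when the counter reaches `multiplier` times the required average
            goal = min(20, max(1, math.ceil(multiplier * need / left)))
            if value >= goal:                        # stopped in time: bank the goal amount
                banked += goal
        wins += (banked >= target)
    return wins / trials

for m in (1.0, 1.25, 1.5, 2.0):
    print(m, win_rate(m))
```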

If they should bring back this show, I hope my UK readers will let me know. This is the kind of puzzle that I could become obsessed with, like the Eternity puzzle, which was coincidentally (or not) also out of the UK.

P.S. Why do you spell "colour" with a u in the UK? It makes no sense to me.

What is the average prize per punch and optimal strategy for the Punch a Bunch game on The Price is Right?

Ibeatyouraces

For those not familiar with the rules, they are explained at the Price Is Right web site. Please take a moment to go there if you're not familiar with the game, because I'm going to assume you know the rules. There are several YouTube videos of the game as well. Here is an old one, which shows a second chance, but the maximum prize at the time was only $10,000. It is now $25,000.

First, let's calculate the expected value of a prize that is not paired with a second chance. The following table shows that the average is $1,371.74.

Punch a Bunch Prize Distribution with no Second Chance

Prize Number Probability Expected Win
25000 1 0.021739 543.478261
10000 1 0.021739 217.391304
5000 3 0.065217 326.086957
1000 5 0.108696 108.695652
500 9 0.195652 97.826087
250 9 0.195652 48.913043
100 9 0.195652 19.565217
50 9 0.195652 9.782609
Total 46 1.000000 1371.739130

Second, let's calculate the average prize that does have a second chance. The following table shows that the average is $225.

Punch a Bunch Prize Distribution with Second Chance

Prize Number Probability Expected Win
500 1 0.250000 125.000000
250 1 0.250000 62.500000
100 1 0.250000 25.000000
50 1 0.250000 12.500000
Total 4 1.000000 225.000000

Third, create an expected value table based on the number of second chances the player finds. This can be found with simple math. For example, the probability of exactly two second chances is (4/50)×(3/49)×(46/48). The expected win given s second chances is $1,371.74 + s×$225. The following table shows the probability and average win for 0 to 4 second chances.

Punch a Bunch Prize Return Table

Second Chances Probability Average Win Expected Win
4 0.000004 2271.739130 0.009864
3 0.000200 2046.739130 0.408815
2 0.004694 1821.739130 8.551020
1 0.075102 1596.739130 119.918367
0 0.920000 1371.739130 1262.000000
Total 1.000000 1390.888067

So the average win per punch (including additional money from second chances) is $1390.89.
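The same arithmetic in Python, for anyone who wants to verify it or change the prize distribution; it is a direct transcription of the three tables above.

```python
from math import prod

PLAIN = {25000: 1, 10000: 1, 5000: 3, 1000: 5, 500: 9, 250: 9, 100: 9, 50: 9}   # 46 plain slips
SECOND_CHANCE = [500, 250, 100, 50]                                              # 4 second-chance slips

plain_ev = sum(prize * count for prize, count in PLAIN.items()) / sum(PLAIN.values())   # $1,371.74
second_chance_ev = sum(SECOND_CHANCE) / len(SECOND_CHANCE)                              # $225.00

total_ev = 0.0
for s in range(5):                       # number of second-chance slips found
    draws = s if s == 4 else s + 1       # s second chances, then one plain slip (if any remain)
    numerator = prod(4 - i for i in range(s)) * (46 if s < 4 else 1)
    denominator = prod(50 - i for i in range(draws))
    prob = numerator / denominator
    total_ev += prob * (plain_ev + s * second_chance_ev)

print(round(plain_ev, 2), round(second_chance_ev, 2), round(total_ev, 2))   # 1371.74 225.0 1390.89
```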

The following table shows my strategy for the minimum win to accept, according to the number of punches remaining. Note that the player can get to $1,400 with prizes of $1,000 + $250 + $100 + $50 via three second chances.

Punch a Bunch Strategy

Punches Remaining Minimum to Stand
3 $5,000
2 $5,000
1 $1,400

This question was raised and discussed in the forum of my companion site Wizard of Vegas.

What would be the optimal strategy for dividing your money on the game show Million Dollar Money Drop, if you were not sure of the answer?

Anon E. Mouse

For the benefit of other readers, let me review the rules first.

  1. A team of players starts with $1,000,000.
  2. The team is given a multiple choice question.
  3. The team divides its money among the possible answers. Whatever money is put on the correct answer moves on to the next question.
  4. The team must completely rule out at least one possible answer by not putting any money on it.
  5. This process repeats for several rounds. The team is also given one chance to change its mind.

Obviously, if the team is sure of the answer, then it should put all its money on the correct answer. If the team can narrow the answer down to two choices, but assigns each a 50% chance of being correct, then it should divide its money equally between the two.

Where it gets more difficult is when the team leans toward one answer but doesn't completely rule out one or more of the others. Let's look at an example. Suppose the team determines the probability of each answer being correct as follows: A 10%, B 20%, C 30%, D 40%. How should it divide up its money?

I claim the answer is to follow the Kelly Criterion. Briefly, the team should maximize the expected log of its wealth with every question. To do this, you have to consider how much wealth you already have.

Let's say your existing wealth, which you have accumulated independently of the show, is $100,000. It is your first question, so you have $1,000,000 of game show money to split up. First eliminate the option with the lowest probability, to conform with the show rules. Then you want to maximize 0.2×log(100,000 + b×1,000,000) + 0.3×log(100,000 + c×1,000,000) + 0.4×log(100,000 + d×1,000,000), where lower-case b, c, and d refer to the portions placed on answers B, C, and D.

This could be solved with calculus and a system of equations, by trial and error, or, my preference, with the "goal seek" feature in Excel. Whatever you use, the right answer is to put 18.9% on B, 33.3% on C, and 47.8% on D.
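For the mathematically inclined, the maximization has a simple closed form via Lagrange multipliers. The sketch below reproduces the 18.9%/33.3%/47.8% split under the same assumptions as the example (outside wealth of $100,000, the 10% answer eliminated); the closed form is valid as long as no share comes out negative.

```python
def kelly_split(probs, wealth, stake):
    """
    Split `stake` across the remaining answers with subjective probabilities `probs`
    (the lowest-probability answer already dropped), maximizing expected log wealth.
    """
    k, p_total = len(probs), sum(probs)
    return [p * (1 + k * wealth / stake) / p_total - wealth / stake for p in probs]

print(kelly_split([0.2, 0.3, 0.4], wealth=100_000, stake=1_000_000))
# about [0.189, 0.333, 0.478]
```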

Of course, nobody on the show is going to be able to do all this math in the time allowed, not to mention that you also have to move a lot of bundles of cash in that time. My more practical advice is to divide up the money in proportion to your assessment of the probability of each answer's being correct, assuming the least likely choice is not a possibility. In the example, that would mean a split of 22.2% on B, 33.3% on C, and 44.4% on D.

This question was raised and discussed in the forum of my companion site Wizard of Vegas.
