## Archive for the ‘Mathematical Aspects’ Category

As in most games of chance, Blackjack strategies never give players a one hundred percent guarantee of winning. In essence, using mathematical probabilities is simply a way to minimize the innate house edge and to make decisions based on the cards that you and the dealer hold, as well as those that are still left in the game.

Therefore, despite the numerous guides that claim to possess the infallible secret, much like fanatical religious/spiritual leaders, you can easily understand why a foolproof methodology has not yet been developed. Jokes aside, let’s first clarify the basic concepts of Blackjack.

**Quick walkthrough of the rules**

First of all, I have seen numerous guides to the game suggesting that the player’s objective consists primarily of accumulating a sum of cards as close as possible to 21, without going over this value. However, the real goal is to accumulate a sum of cards that is higher than the dealer’s but does not exceed 21. At first glance, the two statements are largely similar, but in fact there is a world of difference between them.

In the first case, the player would continue hitting (asking for more cards) until he is satisfied with the difference between the sum of his cards and the maximum value of 21, without taking the dealer’s hand into account. This erroneous strategy would substantially increase the house edge. On the other hand, if you remember that you don’t really need to aim for 21 points, but only for a value just above the one the dealer holds, you stand to gain a lot more. The values of the cards in the game of Blackjack are as follows:

- 2 to 10 cards have the value imprinted on them
- The value of all face cards is 10 points
- The value of the Ace is either 11 (if the sum of the cards does not exceed 21) or 1 (if the sum of the cards exceeds 21)
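The ace’s dual value is the only subtle part of scoring a hand. As a minimal sketch of the three rules above (the function name and card encoding are my own), the scoring can be written as:

```python
def hand_value(cards):
    """Return the blackjack value of a hand.

    `cards` is a list of ranks: 2-10 as ints, 'J', 'Q', 'K', 'A'.
    Aces count as 11 unless that would bust the hand, in which
    case they fall back to 1, one at a time.
    """
    total = 0
    aces = 0
    for card in cards:
        if card == 'A':
            aces += 1
            total += 11
        elif card in ('J', 'Q', 'K'):
            total += 10          # all face cards are worth 10
        else:
            total += card        # 2-10 are worth their printed value
    # Demote aces from 11 to 1 while the hand would otherwise bust
    while total > 21 and aces:
        total -= 10
        aces -= 1
    return total

print(hand_value(['A', 'K']))     # 21 (blackjack)
print(hand_value(['A', 'A', 9]))  # 21 (one ace demoted to 1)
print(hand_value([10, 9, 'A']))   # 20 (ace demoted to 1)
```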

The bettor sees only one of the dealer’s cards and, after being dealt his first two cards, he will be able to:

- **Stay**, meaning that he will not request any more cards
- **Hit**, meaning he will request an additional card
- **Split**, meaning the current hand is split into two individual hands when the cards dealt are the same (two Aces, two 8s, two 5s, etc.)
- **Double down**, meaning the value of the bet is doubled and the player will receive only one more card

It is worth pointing out that some casinos allow the player to “surrender”, canceling the bet if he does not like the hand he has been dealt. However, you should check whether your current table uses this rule.

**What is the house edge in Blackjack?**

The house edge depends heavily on the number of decks in the game. Assuming that you are playing with a single deck, the house edge is calculated at 0.17 percent. The house edge increases with each additional deck brought into play, as follows:

- 2 decks, 0.46%
- 4 decks, 0.60%
- 6 decks, 0.64%
- 8 decks, 0.66%

To understand why the house gains an edge over the bettor that grows with the number of decks in play, you need to know that the game of Blackjack is based on dependent events: each card dealt changes the odds for the cards that follow. With more decks, it becomes much more difficult to estimate the number of cards that are still in the game and to calculate the odds of the remaining cards accurately, which means that card counting no longer provides the same advantage. For example, when you are playing with a single deck, you know that once all the Aces are out, the probability that either you or the dealer draws one is 0.
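A quick illustration of that last point, using exact fractions (the helper name is my own):

```python
from fractions import Fraction

def p_next_is_ace(aces_left, cards_left):
    """Probability that the next card drawn is an ace, given the deck state."""
    return Fraction(aces_left, cards_left)

# Fresh 52-card deck: 4 aces among 52 cards
print(p_next_is_ace(4, 52))   # 1/13

# Once all four aces are out (here, 20 cards into a single deck),
# drawing another ace is impossible -- the events are dependent.
print(p_next_is_ace(0, 32))   # 0
```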

**An example of strategy**

As mentioned previously, there is no strategy that will function optimally every time, but rather decision-making systems that have a mathematically higher chance of winning. To understand this better, we have compiled a list of such decisions in the following tables:

| Player’s Hard Hand | Dealer shows 2 – 6 | Dealer shows 7 – Ace |
|---|---|---|
| 4 – 8 | Hit | Hit |
| 9 | Double Down | Hit |
| 10 – 11 | Double Down | Double Down |
| 12 – 16 | Stay | Hit |
| 17 – 21 | Stay | Stay |

| Player’s Soft Hand | Dealer shows 2 – 6 | Dealer shows 7 – Ace |
|---|---|---|
| 13 – 15 | Hit | Hit |
| 16 – 18 | Double Down | Hit |
| 19 – 21 | Stay | Stay |

| Player’s Split Hand | Dealer shows 2 – 6 | Dealer shows 7 – Ace |
|---|---|---|
| 2-2, 3-3, 6-6, 7-7, 9-9 | Split | Don’t Split |
| 8-8, Ace-Ace | Split | Split |
| 4-4, 5-5, 10-10 | Don’t Split | Don’t Split |
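The hard- and soft-hand rows of the table can be encoded as a single decision function. The sketch below is my own encoding of this simplified chart (not a complete basic-strategy table; function and parameter names are assumptions):

```python
def basic_strategy(player_total, dealer_up, soft=False):
    """Decision from the simplified strategy table.

    `player_total` is the hand total, `dealer_up` is the dealer's
    visible card value (2-11, with 11 standing for an ace), and
    `soft` marks hands that count an ace as 11.
    """
    dealer_weak = 2 <= dealer_up <= 6   # the "2 - 6" column
    if soft:
        if player_total <= 15:
            return 'hit'
        if player_total <= 18:
            return 'double down' if dealer_weak else 'hit'
        return 'stay'
    # Hard hands
    if player_total <= 8:
        return 'hit'
    if player_total == 9:
        return 'double down' if dealer_weak else 'hit'
    if player_total <= 11:
        return 'double down'
    if player_total <= 16:
        return 'stay' if dealer_weak else 'hit'
    return 'stay'

print(basic_strategy(16, 10))            # hit
print(basic_strategy(11, 6))             # double down
print(basic_strategy(18, 5, soft=True))  # double down
```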

**Byline**

This guest post is written by Andrea, a passionate student of the theory behind casino games and writer at blackjackdomain.com.

**Advantage gambling**, or **advantage play**, refers to a practice of using legal ways to gain a mathematical advantage while gambling. The term usually refers to house-banked games, but can also refer to games played against other players, such as poker. Someone who practices advantage gambling is referred to as an **advantage gambler**, or an **advantage player**.

A skillful or knowledgeable player can gain an advantage at a number of games. Blackjack can usually be beaten with card counting and sometimes with shuffle tracking. Some video poker games can be beaten by the use of a strategy card devised by computer analysis of the game. Some progressive slot machines can eventually have such a high jackpot that they offer a positive return when played. Some online games can be beaten with *bonus hunting*.

## Sports and horse betting

Sports and horse betting can be beaten in the long run by skillful handicappers who only bet when they believe the line offers them an advantage. Sports and horse betting can also be beaten by placing arbitrage bets, which involve placing bets at different bookmakers who are offering different lines. Many online sports books now offer bonuses like free bets or free money. These bonuses usually come with a stipulation that the bettor place a certain number of bets. For example, a site may offer a bettor $50 free if they deposit $100 and place a total of $1000 in bets. These can reduce the “juice” or vig taken by the house or even offer the bettor a small advantage.

Another form of advantage can be found by betting the “middle” on a sports event. This situation occurs when two bookmakers are offering different lines on the same event, or if a bettor has placed a bet and the bookmaker changes the line. The bettor simply takes the most favorable lines at each bookmaker, and if the result of the contest is between the numbers, or in the “middle”, then the bettor wins both bets.

For example, Bookmaker A lists the Jets to be a 4-point favorite over the Bills. Bookmaker B has the Jets as just a 2-point favorite. The advantage player may bet the Bills +4 with Book A and then the Jets -2 with Book B. If the Jets win by 3, the advantage player collects on both bets. If the Jets win by either 2 or 4, the advantage player collects on one winning bet. And if the Jets win or lose by any other total, the two bets cancel out, leaving the advantage player to pay only the vigorish on the bets. Given typical 10-cent lines, a middle need only win 1 time in 20 to break even, which is a realistic goal – the middle is always a plausible result since it is based on the actual strength of the teams. Middling is an example of line arbitrage.
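The break-even arithmetic for a middle can be checked directly. The sketch below assumes standard −110 pricing (risk $110 to win $100 on each side); the function name is my own. Solving for the break-even frequency gives 10/210, roughly 1 middle in 21, close to the 1-in-20 figure quoted above:

```python
def middle_ev(p_middle, stake=110, win=100):
    """Expected profit of a two-sided middle at -110 pricing.

    If the result lands in the middle, both bets win (+2 * win);
    otherwise one bet wins and the other loses (a net -10 here,
    i.e. the vigorish on the losing side).
    """
    both_win = 2 * win      # e.g. Jets -2 and Bills +4 both cash
    split = win - stake     # one winner, one loser: -10
    return p_middle * both_win + (1 - p_middle) * split

# Break-even frequency: solve 200p - 10(1 - p) = 0  ->  p = 10/210
p_break_even = 10 / 210
print(round(p_break_even, 4))    # ~0.0476, about 1 middle in 21
print(middle_ev(0.06) > 0)       # True: hitting 6% of middles is profitable
```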

### Betting exchanges

Betting exchanges offer advantage players a chance to make a larger profit than is possible with bookmakers, because exchanges charge commission only on the net winnings in a particular betting market. One way to make money on the exchanges is “trading” – in the above example, the Jets might be a favorite at decimal odds of 1.90 to defeat the Bills. If a “trader” thinks these odds are too long, he may bet $1000 on the Jets and, should he prove correct and the odds on the Jets get shorter, “lay off” by laying, say, a $1016 bet on the Jets at 1.87. If the Jets win, he collects $900 on his bet on the Jets and pays out approximately $884 on the bet he laid against the Jets. If the Jets lose, he loses his $1000 stake on the Jets but keeps the $1016 stake on the bet he laid against the Jets. Either way, the “trader” makes a $16 profit and will pay commission only on that profit (usually not more than 5%, or 80 cents in this example), for a net profit of $15.20 regardless of the result. Of course, if the odds go the wrong way the “trader” may lose money, but exchanges do not charge a commission in the event of a net loss.
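The trading arithmetic above can be verified in a few lines. The equal-profit lay stake (back stake × back odds ÷ lay odds) is standard exchange arithmetic; the function name is my own:

```python
def lay_stake_for_equal_profit(back_stake, back_odds, lay_odds):
    """Lay stake that locks in the same profit whichever side wins."""
    return back_stake * back_odds / lay_odds

back_stake, back_odds, lay_odds = 1000.0, 1.90, 1.87
lay = lay_stake_for_equal_profit(back_stake, back_odds, lay_odds)

# If the Jets win: collect the back profit, pay out the lay liability.
profit_if_win = back_stake * (back_odds - 1) - lay * (lay_odds - 1)
# If the Jets lose: lose the back stake, keep the lay stake.
profit_if_lose = lay - back_stake

print(round(lay, 2))             # ~1016.04
print(round(profit_if_win, 2))   # ~16.04
print(round(profit_if_lose, 2))  # ~16.04 -- same profit either way
```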

## Poker

Poker can offer a long-term advantage to a skilled player because it is played against other players and not against the house. The casino usually takes a rake or a session fee. A skilled poker player can often win enough from the game to cover the rake and make a profit.

## Other ways to gain an advantage

### Dice control

Experts disagree about whether or not an advantage can be gained at some other games. One example is dice control. Authors Stanford Wong and Frank Scoblete believe that by setting and throwing the dice in a certain way players can alter the odds at the game of craps enough to gain an advantage.

### Pachinko

In the Japanese game of pachinko, there are numerous purported strategies for winning, most reliably to use inside information to learn which machines have the highest payout settings.

### Angle shooting

“Angle shooting” is another type of advantage play. “Angle shooting” refers to legal but possibly unethical ways to beat casino games. One way to get an advantage at a casino is “*hole carding*” where a player tries to look at the dealer’s hole card in blackjack and then uses that information to play his hand differently. Taking advantage of incorrect payouts is another example of angle shooting. Not correcting an inexperienced dealer who pays 2 to 1 on a blackjack instead of 3 to 2 is an example of taking advantage of an incorrect payout.

Angle shooting may also be undertaken in poker. If, for example, an angle shooter attempts to bluff at the end of a hand, and is called, he may announce his hand as a flush even if it does not qualify. If the calling player throws away his hand, the angle shooter will claim the pot with his non-flush, claiming he made an honest error in announcing his hand. Similarly, angle shooters might hold on to a losing hand, hoping the winning hand will be mucked at showdown due to player or dealer error, and then claim the pot. A simple way to avoid being taken advantage of by angle shooting at poker is to always protect one’s hand and always let the “cards speak”; that is, to turn over one’s hand at the showdown for all to see.

### Comp hustling

Comp hustling can be another form of advantage gambling. Players who play games with a low house advantage can get more than their expected loss in free items from the casino. Many advantage players also take steps to maximize the comps they receive from their play.

## Hazards of advantage gambling

Casinos sometimes take measures to thwart players who they believe pose a threat to them, especially card-counters or hole-card players. But some casinos tolerate card-counters who don’t bet large amounts, who are not good at counting, or who don’t use a large betting spread. Some countermeasures include shuffling more frequently, imposing betting limits, “backing off” the player by asking him not to play blackjack any more, or asking the player to leave the casino. In New Jersey, a player may not be asked to leave a table for counting cards, although the house may still impose betting limits or shuffle sooner. Players caught counting cards or hole-carding ultimately may find themselves listed in the Griffin Book and become unwelcome in most casinos. Video poker and skillful progressive slot players are rarely ejected, but it has happened. They may have their comps reduced or eliminated. Skillful sports bettors may have their betting limits reduced and may not be allowed to take advantage of bonuses at online sports books.

Craps players are often stopped from playing if the dice fail to bounce off the back wall of the table.

Advantage players abide by the established rules of the game and thus, in most jurisdictions, are not regarded as committing fraud against the casino. So, while they may face the above casino-imposed sanctions, they are able to operate without the threat of criminal prosecution for their behavior.

The Gambler’s fallacy, also known as the *Monte Carlo fallacy* (because its most famous example happened in a Monte Carlo casino in 1913) or the fallacy of the maturity of chances, is the belief that if deviations from expected behaviour are observed in repeated independent trials of some random process then these deviations are likely to be evened out by opposite deviations in the future. For example, if a fair coin is tossed repeatedly and tails comes up a larger number of times than is expected, a gambler may incorrectly believe that this means that heads is more likely in future tosses. Such an expectation could be mistakenly referred to as being due. This is an informal fallacy. It is also known colloquially as the law of averages.

The gambler’s fallacy implicitly involves an assertion of negative correlation between trials of the random process and therefore involves a denial of the exchangeability of outcomes of the random process.

The reversal is also a fallacy, the inverse gambler’s fallacy, in which a gambler may instead decide that tails are more likely out of some mystical preconception that fate has thus far allowed for consistent results of tails; the false conclusion being: Why change if odds favor tails? Again, the fallacy is the belief that the “universe” somehow carries a memory of past results which tend to favor or disfavor future outcomes.

## An example: coin-tossing

The gambler’s fallacy can be illustrated by considering the repeated toss of a fair coin. With a fair coin, the outcomes in different tosses are statistically independent and the probability of getting heads on a single toss is exactly ^{1}⁄_{2} (one in two). It follows that the probability of getting two heads in two tosses is ^{1}⁄_{4} (one in four) and the probability of getting three heads in three tosses is ^{1}⁄_{8} (one in eight). In general, if we let *A_{i}* be the event that toss *i* of a fair coin comes up heads, then we have:

- Pr(*A*_{1} ∩ *A*_{2} ∩ … ∩ *A*_{n}) = Pr(*A*_{1}) × Pr(*A*_{2}) × … × Pr(*A*_{n}) = (^{1}⁄_{2})^{n}.

Now suppose that we have just tossed four heads in a row, so that if the next coin toss were also to come up heads, it would complete a run of five successive heads. Since the probability of a run of five successive heads is only ^{1}⁄_{32} (one in thirty-two), a believer in the gambler’s fallacy might believe that this next flip is less likely to be heads than to be tails. However, this is not correct, and is a manifestation of the gambler’s fallacy; the event of 5 heads in a row and the event of “first 4 heads, then a tails” are equally likely, each having probability ^{1}⁄_{32}. Given that the first four tosses turn up heads, the probability that the next toss is a head is in fact:

- Pr(*A*_{5} | *A*_{1} ∩ *A*_{2} ∩ *A*_{3} ∩ *A*_{4}) = Pr(*A*_{5}) = ^{1}⁄_{2}.

While the probability of a run of five heads is only ^{1}⁄_{32} = 0.03125, it is only that *before* the coin is first tossed. *After* the first four tosses, the results are no longer unknown, so their probabilities are 1. Reasoning that the next toss is more likely to be a tail than a head due to the past tosses, that is, that a run of luck in the past somehow influences the odds in the future, is the fallacy.
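The conditional probability can be checked by brute-force enumeration of all 32 equally likely five-toss sequences; a short sketch:

```python
from itertools import product

# All 32 equally likely outcomes of five fair-coin tosses.
outcomes = list(product('HT', repeat=5))

# Outcomes whose first four tosses are heads: HHHHH and HHHHT.
four_heads = [o for o in outcomes if o[:4] == ('H',) * 4]
five_heads = [o for o in four_heads if o[4] == 'H']

# P(5th is heads | first four were heads) -- exactly 1/2.
print(len(five_heads) / len(four_heads))   # 0.5

# But P(HHHHH) over all 32 outcomes is 1/32, same as P(HHHHT).
print(len(five_heads) / len(outcomes))     # 0.03125
```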

## Explaining why the probability is 1/2 for a fair coin

We can see from the above that, if one flips a fair coin 21 times, then the probability of 21 heads is 1 in 2,097,152. However, the probability of flipping a head *after having already flipped 20 heads in a row* is simply ^{1}⁄_{2}. This is an application of *Bayes’ theorem*.

This can also be seen without knowing for certain that 20 heads have occurred (that is, without applying Bayes’ theorem). Consider the following two probabilities, assuming a fair coin:

- probability of 20 heads, then 1 tail = 0.5^{20} × 0.5 = 0.5^{21}
- probability of 20 heads, then 1 head = 0.5^{20} × 0.5 = 0.5^{21}

The probability of getting 20 heads then 1 tail, and the probability of getting 20 heads then another head, are both 1 in 2,097,152. Therefore, it is equally likely to flip 21 heads as it is to flip 20 heads and then 1 tail when flipping a fair coin 21 times. Furthermore, these two probabilities are just as likely as any other 21-flip combination that can be obtained (there are 2,097,152 in total); every 21-flip combination has probability 0.5^{21}, or 1 in 2,097,152. From these observations, there is no reason to assume at any point that a change of luck is warranted based on prior trials (flips), because every outcome observed will always have been just as likely as the other outcomes that were not observed for that particular trial, given a fair coin. Therefore, just as Bayes’ theorem shows, the result of each trial comes down to the base probability of the fair coin: ^{1}⁄_{2}.

## Other examples

There is another way to emphasize the fallacy. As already mentioned, the fallacy is built on the notion that previous failures indicate an increased probability of success on subsequent attempts. This is, in fact, the inverse of what actually happens, even for a fair chance of a successful event, given a set number of iterations. Assume a fair 16-sided die, where a win is defined as rolling a 1, and assume a player is given 16 rolls to obtain at least one win. The low winning odds are just to make the change in probability more noticeable. The probability of having at least one win in the 16 rolls is:

- 1 − p(no ones in 16 rolls) = 1 − (^{15}⁄_{16})^{16} ≈ 64.4%

However, assume now that the first roll was a loss (a 93.75% chance of that, ^{15}⁄_{16}). The player now has only 15 rolls left and, according to the fallacy, should have a higher chance of winning since one loss has occurred. His chances of having at least one win are now:

- 1 − (^{15}⁄_{16})^{15} ≈ 62.0%

Simply by losing one toss, the player’s probability of winning has dropped by about 2 percentage points. After 5 losses (11 rolls left), his probability of winning on one of the remaining rolls will have dropped to about 50%. The player’s odds of at least one win in those 16 rolls have not increased given a series of losses; his odds have decreased because he has fewer iterations left in which to win. In other words, the previous losses in no way contribute to the odds of the remaining attempts, but there are fewer remaining attempts to gain a win, which results in a lower probability of obtaining it.

The player becomes more likely to lose in a set number of iterations as he fails to win, and eventually his probability of winning will again equal the probability of winning a single toss, when only one toss is left: 6.25% in this instance.
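The shrinking at-least-one-win probability can be tabulated directly (the function name is my own):

```python
def p_at_least_one_win(rolls_left, sides=16):
    """P(at least one 1) in `rolls_left` rolls of a fair `sides`-sided die."""
    return 1 - ((sides - 1) / sides) ** rolls_left

# The probability falls as rolls are used up, ending at 1/16 = 6.25%.
for k in (16, 15, 11, 1):
    print(k, round(p_at_least_one_win(k), 4))
# 16 -> ~0.6439, 15 -> ~0.6202, 11 -> ~0.5083, 1 -> 0.0625
```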

Some lottery players will choose the same numbers every time, or intentionally change their numbers, but both are equally likely to win any individual lottery draw. Copying the numbers that won the *previous* lottery draw gives an equal probability, although a rational gambler might attempt to predict other players’ choices and then deliberately avoid these numbers. Low numbers (below 31 and especially below 12) are popular because people play birthdays as their so-called lucky numbers; hence a win in which these numbers are over-represented is more likely to result in a shared payout.

A joke told among mathematicians demonstrates the nature of the fallacy. When flying on an aircraft, a man decides to always bring a bomb with him. “The chances of an aircraft having a bomb on it are very small,” he reasons, “and certainly the chances of having two are almost none!”

A similar example is in the book *The World According to Garp* when the hero Garp decides to buy a house a moment after a small plane crashes into it, reasoning that the chances of another plane hitting the house have just dropped to zero.

The most famous example happened in a Monte Carlo casino in the summer of 1913, when the ball fell on black 26 times in a row, an extremely uncommon occurrence, and gamblers lost millions of francs betting *against* black after the streak began. Gamblers reasoned incorrectly that the streak was causing an “imbalance” in the randomness of the wheel, and that it had to be followed by a long streak of red.

## Non-examples of the fallacy

There are many scenarios where the gambler’s fallacy might superficially seem to apply but does not. When the probability of different events is *not* independent, the probability of future events can change based on the outcome of past events (statistical permutation). Formally, the system is said to have *memory*. An example of this is cards drawn without replacement. For example, once a jack is removed from the deck, the next draw is less likely to be a jack and more likely to be of another rank. Thus, the odds for drawing a jack, assuming that it was the first card drawn and that there are no jokers, have decreased from ^{4}⁄_{52} (7.69%) to ^{3}⁄_{51} (5.88%), while the odds for each other rank have increased from ^{4}⁄_{52} (7.69%) to ^{4}⁄_{51} (7.84%). This is how counting cards really works, when playing the game of blackjack.
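The jack example can be verified with exact fractions:

```python
from fractions import Fraction

deck_size, jacks_in_deck = 52, 4

p_first_is_jack = Fraction(jacks_in_deck, deck_size)              # 4/52
p_jack_after_jack = Fraction(jacks_in_deck - 1, deck_size - 1)    # 3/51
p_other_rank_after_jack = Fraction(4, deck_size - 1)              # 4/51

print(p_first_is_jack, float(p_first_is_jack))            # 1/13, ~0.0769
print(p_jack_after_jack, float(p_jack_after_jack))        # 1/17, ~0.0588
print(float(p_other_rank_after_jack))                     # ~0.0784
```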

The outcome of future events can be affected if external factors are allowed to change the probability of the events (e.g., changes in the rules of a game affecting a sports team’s performance levels). Additionally, an inexperienced player’s success may decrease after opposing teams discover his or her weaknesses and exploit them. The player must then attempt to compensate and randomize his strategy.

Many riddles trick the reader into believing that they are an example of the gambler’s fallacy, such as the Monty Hall problem.

## Non-example: unknown probability of event

When the probability of repeated events is *not known*, outcomes may not be equally probable. In the case of coin tossing, as a run of heads gets longer and longer, the likelihood that the coin is biased towards heads increases. If one flips a coin 21 times in a row and obtains 21 heads, one might rationally conclude a high probability of bias towards heads, and hence conclude that future flips of this coin are also highly likely to be heads. In fact, Bayesian inference can be used to show that when the long-run proportions of different outcomes are unknown but exchangeable (meaning that the random process from which they are generated may be biased but is equally likely to be biased in any direction), previous observations demonstrate the likely direction of the bias, such that the outcome which has occurred the most in the observed data is the most likely to occur again.
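One way to make this concrete is a Beta-Binomial update, a standard Bayesian model for an unknown coin bias (the uniform prior here is my own choice of illustration, not something the text specifies):

```python
# Beta-Binomial update: start from a uniform Beta(1, 1) prior on the
# coin's unknown heads probability, then observe 21 heads in 21 flips.
prior_alpha, prior_beta = 1, 1
heads, tails = 21, 0

# Conjugate update: add observed heads to alpha, tails to beta.
post_alpha = prior_alpha + heads
post_beta = prior_beta + tails

# Posterior predictive P(next flip is heads) = alpha / (alpha + beta).
p_next_heads = post_alpha / (post_alpha + post_beta)
print(round(p_next_heads, 4))   # ~0.9565, i.e. 22/23 -- far from 0.5
```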

## Psychology behind the fallacy

Amos Tversky and Daniel Kahneman proposed that the gambler’s fallacy is a cognitive bias produced by a psychological heuristic called the *representativeness heuristic*. According to this view, “after observing a long run of red on the roulette wheel, for example, most people erroneously believe that black will result in a more representative sequence than the occurrence of an additional red”, so people expect that a short run of random outcomes should share properties of a longer run, specifically in that deviations from average should balance out. When people are asked to make up a random-looking sequence of coin tosses, they tend to make sequences where the proportion of heads to tails stays close to 0.5 in any short segment more so than would be predicted by chance; Kahneman and Tversky interpret this to mean that people believe short sequences of random events should be representative of longer ones.

The representativeness heuristic is also cited behind the related phenomenon of the clustering illusion, according to which people see streaks of random events as being non-random when such streaks are actually much more likely to occur in small samples than people expect.