Gambler's ruin

The term gambler's ruin is a statistical concept, most commonly expressed as the fact that a gambler playing a negative expected value game will eventually go broke, regardless of their betting system.

The original meaning of the term is that a persistent gambler who raises his bet to a fixed fraction of bankroll when he wins, but does not reduce it when he loses, will eventually and inevitably go broke, even if he has a positive expected value on each bet.

Another common meaning is that a persistent gambler with finite wealth, playing a fair game (that is, each bet has expected value zero to both sides) will eventually and inevitably go broke against an opponent with infinite wealth. Such a situation can be modeled by a random walk on the real number line. In that context it is provable that the agent will return to his point of origin or go broke and is ruined an infinite number of times if the random walk continues forever. This is a corollary of a general theorem by Christiaan Huygens which is also known as gambler's ruin. That theorem shows how to compute the probability of each player winning a series of bets that continues until one's entire initial stake is lost, given the initial stakes of the two players and the constant probability of winning. This is the oldest mathematical idea that goes by the name gambler's ruin, but not the first idea to which the name was applied. The term's common usage today is another corollary to Huygens's result.

The concept may be stated as an ironic paradox: Persistently taking beneficial chances is never beneficial at the end. This paradoxical form of gambler's ruin should not be confused with the gambler's fallacy, a different concept.

The concept has specific relevance for gamblers; however, it also leads to mathematical theorems with wide application and many related results in probability and statistics. Huygens's result in particular led to important advances in the mathematical theory of probability.

History

The earliest known mention of the gambler's ruin problem is a letter from Blaise Pascal to Pierre Fermat in 1656 (two years after the more famous correspondence on the problem of points).[1] Pascal's version was summarized in a 1656 letter from Pierre de Carcavi to Huygens:

Let two men play with three dice, the first player scoring a point whenever 11 is thrown, and the second whenever 14 is thrown. But instead of the points accumulating in the ordinary way, let a point be added to a player's score only if his opponent's score is nil, but otherwise let it be subtracted from his opponent's score. It is as if opposing points form pairs, and annihilate each other, so that the trailing player always has zero points. The winner is the first to reach twelve points; what are the relative chances of each player winning?[2]

Huygens reformulated the problem and published it in De ratiociniis in ludo aleae ('On Reasoning in Games of Chance', 1657):

Problem (2-1) Each player starts with 12 points, and a successful roll of the three dice for a player (getting an 11 for the first player or a 14 for the second) adds one to that player's score and subtracts one from the other player's score; the loser of the game is the first to reach zero points. What is the probability of victory for each player?[3]

This is the classic gambler's ruin formulation: two players begin with fixed stakes, transferring points until one or the other is 'ruined' by getting to zero points. However, the term 'gambler's ruin' was not applied until many years later.[4]

Reasons for the four results

Let 'bankroll' be the amount of money a gambler has at his disposal at any moment, and let N be any positive integer. Suppose that he raises his stake to $\frac{\text{bankroll}}{N}$ when he wins, but does not reduce his stake when he loses. This general pattern is not uncommon among real gamblers, and casinos encourage it by 'chipping up' winners (giving them higher denomination chips).[5] Under this betting scheme, it will take at most N losing bets in a row to bankrupt him. If his probability of winning each bet is less than 1 (if it is 1, then he is no gambler), he will eventually lose N bets in a row, however big N is. It is not necessary that he follow the precise rule, just that he increase his bet fast enough as he wins. This is true even if the expected value of each bet is positive.
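To make the argument concrete, here is a minimal simulation sketch of this betting pattern. Everything in it (the even-money payout, the win probability of 0.55, the function name) is an illustrative assumption, not part of the original statement; the point it demonstrates is only that a run of N consecutive losses, which must eventually occur, wipes out the bankroll, because the stake is always at least one N-th of the current bankroll.

```python
import random

def simulate(start=100.0, n=10, p=0.55, max_bets=1_000_000, seed=0):
    """Bet at even money; raise the stake to bankroll/n after each win,
    never reduce it after a loss. Returns the number of bets until ruin,
    or None if the gambler somehow survives max_bets bets."""
    rng = random.Random(seed)
    bankroll = start
    stake = start / n                  # initial stake: one n-th of the bankroll
    for t in range(1, max_bets + 1):
        if rng.random() < p:           # win: bankroll grows and the stake is raised
            bankroll += stake
            stake = bankroll / n
        else:                          # loss: bankroll shrinks, stake stays the same
            bankroll -= stake
        if bankroll <= stake * 1e-9:   # n losses in a row since the last win = ruin
            return t                   # (the tolerance only guards against rounding)
    return None

# Even with a favourable win probability of 0.55, every run ends in ruin.
print([simulate(seed=s) for s in range(5)])
```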

The gambler playing a fair game (with 0.5 probability of winning) will eventually either go broke or double his wealth. Define the game as ending upon either event. These events are equally likely, or the game would not be fair. So he has a 0.5 chance of going broke before doubling his money. If he does double his money, a new game begins and he again has a 0.5 chance of doubling his money before going broke. After the second game there is a 1/2 × 1/2 chance that he has not gone broke in the first and second games. Continuing this way, his chance of not going broke after n successive games is (1/2)^n, which approaches 0. His chance of going broke within n successive games is 1/2 + 1/4 + 1/8 + ⋯ + 1/2^n = 1 − (1/2)^n, which approaches 1.
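The doubling argument can be checked numerically. The sketch below is an illustration assuming unit, even-money bets (none of the names or parameter values come from the source): it plays successive 'double or go broke' games against a fair coin and estimates the probability of surviving n of them, which should track (1/2)^n.

```python
import random

def double_or_bust(bankroll, rng):
    """One 'game': bet one penny at a time on a fair coin until the
    bankroll either doubles or hits zero. Returns True if it doubled."""
    x, target = bankroll, 2 * bankroll
    while 0 < x < target:
        x += 1 if rng.random() < 0.5 else -1
    return x == target

def survival_fraction(n_games, start=4, trials=10_000, seed=1):
    """Estimated chance of surviving n_games successive double-or-bust
    games (the bankroll doubles after each game survived).
    The argument above says this should be close to (1/2)**n_games."""
    rng = random.Random(seed)
    alive = sum(
        all(double_or_bust(start * 2**k, rng) for k in range(n_games))
        for _ in range(trials)
    )
    return alive / trials

for n in range(1, 5):
    print(n, survival_fraction(n), 0.5 ** n)
```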

Huygens's result is illustrated in the next section.

The eventual fate of a player at a negative expected value game cannot be better than that of a player at a fair game, so he will go broke as well.

Example of Huygens's result

Fair coin flipping

Consider a coin-flipping game with two players where each player has a 50% chance of winning with each flip of the coin. After each flip of the coin the loser transfers one penny to the winner. The game ends when one player has all the pennies.

If there are no other limitations on the number of flips, the probability that the game will eventually end this way is 1. (One way to see this is as follows. Any given finite string of heads and tails will eventually be flipped with certainty: the probability of not seeing this string, while high at first, decays exponentially. In particular, the players would eventually flip a string of heads as long as the total number of pennies in play, by which time the game must have already ended.)

If player one has n1 pennies and player two n2 pennies, the probabilities P1 and P2 that players one and two, respectively, will end penniless are:

\[
\begin{aligned}
P_1 &= \frac{n_2}{n_1+n_2} \\
P_2 &= \frac{n_1}{n_1+n_2}
\end{aligned}
\]

Two examples of this are when one player has more pennies than the other, and when both players have the same number of pennies. In the first case, say player one ($P_1$) has 8 pennies and player two ($P_2$) has 5 pennies; then the probability of each losing is:

\[
\begin{aligned}
P_1 &= \frac{5}{8+5} = \frac{5}{13} \approx 0.3846 \text{ or } 38.46\% \\
P_2 &= \frac{8}{8+5} = \frac{8}{13} \approx 0.6154 \text{ or } 61.54\%
\end{aligned}
\]

It follows that even with equal odds of winning, the player who starts with fewer pennies is more likely to fail.

In the second case, where both players have the same number of pennies (in this case 6), the likelihood of each losing is:

\[
\begin{aligned}
P_1 &= \frac{6}{6+6} = \frac{6}{12} = \frac{1}{2} = 0.5 \\
P_2 &= \frac{6}{6+6} = \frac{6}{12} = \frac{1}{2} = 0.5
\end{aligned}
\]
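Both worked examples can be checked against a direct Monte Carlo simulation of the penny game. The sketch below is illustrative only; the function name and trial count are arbitrary choices, not from the source.

```python
import random

def ruin_probability_fair(n1, n2, trials=50_000, seed=2):
    """Monte Carlo estimate of the probability that player one
    (starting with n1 pennies) ends penniless in the fair coin game."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        x = n1                         # player one's pennies
        while 0 < x < n1 + n2:
            x += 1 if rng.random() < 0.5 else -1
        ruined += (x == 0)
    return ruined / trials

# Closed form n2/(n1 + n2) versus simulation, for the two cases above.
for n1, n2 in [(8, 5), (6, 6)]:
    print(n1, n2, n2 / (n1 + n2), ruin_probability_fair(n1, n2))
```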

Unfair coin flipping

In the event of an unfair coin, where player one wins each toss with probability p and player two wins with probability q = 1 − p, the probability of each ending penniless is:

\[
\begin{aligned}
P_1 &= \frac{1 - (p/q)^{n_2}}{1 - (p/q)^{n_1+n_2}} \\
P_2 &= \frac{1 - (q/p)^{n_1}}{1 - (q/p)^{n_1+n_2}}
\end{aligned}
\]
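Translated into code, the two closed-form expressions read as follows. This is only a sketch: the example values p = 0.48 and n1 = n2 = 20 are arbitrary, and the formulas require p ≠ 0.5, since p/q = 1 would make the denominators vanish (the fair case is covered by the simpler formulas of the previous section).

```python
def ruin_probabilities(p, n1, n2):
    """Closed-form gambler's-ruin probabilities for a biased coin,
    where p is player one's chance of winning each toss (p != 0.5).
    Returns (P1, P2): the probabilities that players one and two,
    respectively, end penniless."""
    q = 1.0 - p
    r = p / q
    p1 = (1 - r**n2) / (1 - r**(n1 + n2))
    p2 = (1 - (1 / r)**n1) / (1 - (1 / r)**(n1 + n2))
    return p1, p2

p1, p2 = ruin_probabilities(p=0.48, n1=20, n2=20)
print(p1, p2, p1 + p2)   # p1 + p2 is 1 up to rounding; a small per-toss edge compounds
```

Even a 48% chance per toss with equal stacks of 20 leaves player one ruined more than 80% of the time.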

The closed form can be derived as follows. Consider the probability of player 1 experiencing gambler's ruin having started with an amount of money $n > 1$, denoted $P(R_n)$. Then, using the law of total probability, we have

\[
P(R_n) = P(R_n \mid W)\,P(W) + P(R_n \mid \bar{W})\,P(\bar{W}),
\]

where W denotes the event that player 1 wins the first bet. Then clearly $P(W) = p$ and $P(\bar{W}) = 1 - p = q$. Also, $P(R_n \mid W)$ is the probability that player 1 experiences gambler's ruin having started with an amount of money $n + 1$, namely $P(R_{n+1})$; and $P(R_n \mid \bar{W})$ is the probability that player 1 experiences gambler's ruin having started with an amount of money $n - 1$, namely $P(R_{n-1})$.

Denoting $q_n = P(R_n)$, we get the linear homogeneous recurrence relation

\[
q_n = q_{n+1}\,p + q_{n-1}\,q,
\]

which we can solve using the fact that $q_0 = 1$ (i.e. the probability of gambler's ruin given that player 1 starts with no money is 1) and $q_{n_1+n_2} = 0$ (i.e. the probability of gambler's ruin given that player 1 starts with all the money is 0). For a more detailed description of the method see, e.g., Feller (1970), An Introduction to Probability Theory and Its Applications, 3rd ed.
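As a cross-check on this derivation, the recurrence can be solved numerically with the two boundary conditions and compared against the closed form above. The sketch below uses simple forward substitution; that implementation choice, like the test values, is mine rather than anything prescribed by the source.

```python
def ruin_by_recurrence(p, n1, n2):
    """Solve q_n = p*q_{n+1} + q*q_{n-1} with q_0 = 1 and q_{n1+n2} = 0.
    Write q_n = a_n + b_n * q_1 with q_1 unknown, propagate the
    coefficients forward, then pick q_1 so that q_{n1+n2} = 0."""
    q = 1.0 - p
    N = n1 + n2
    a, b = [1.0, 0.0], [0.0, 1.0]      # coefficients for q_0 and q_1
    for n in range(1, N):
        # rearrange the recurrence: q_{n+1} = (q_n - q*q_{n-1}) / p
        a.append((a[n] - q * a[n - 1]) / p)
        b.append((b[n] - q * b[n - 1]) / p)
    q1 = -a[N] / b[N]                  # enforce the upper boundary condition
    return a[n1] + b[n1] * q1          # ruin probability starting from n1

# Agrees with the closed form above (about 0.83 for p = 0.48, n1 = n2 = 20).
print(ruin_by_recurrence(0.48, 20, 20))
```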

N-player ruin problem

The problem described above (two players) is a special case of the so-called N-player ruin problem. Here $N \geq 2$ players with initial capital $x_1, x_2, \ldots, x_N$ dollars, respectively, play a sequence of (arbitrary) independent games and win and lose certain amounts of dollars from/to each other according to fixed rules. The sequence of games ends as soon as at least one player is ruined. Standard Markov chain methods can be applied to solve this more general problem in principle, but the computations quickly become prohibitive as soon as the number of players or their initial capital increases. For $N = 2$ and large initial capitals $x_1, x_2$, the solution can be well approximated by using two-dimensional Brownian motion. (For $N \geq 3$ this is not possible.) In practice the true problem is to find the solution for the typical cases of $N \geq 3$ and limited initial capital. Swan (2006) proposed an algorithm based on matrix-analytic methods (the folding algorithm for ruin problems) which significantly reduces the order of the computational task in such cases.
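For N ≥ 3 there is no simple closed form and, as noted above, exact computations grow quickly, but a crude Monte Carlo estimate is easy to write. The sketch below assumes one particular toy rule (each round, two distinct players are chosen at random and a fair coin moves one dollar between them, stopping at the first ruin); this rule, and everything else in the snippet, is an illustrative assumption and not the setting analysed by Swan (2006).

```python
import random

def n_player_ruin(capitals, trials=20_000, seed=3):
    """Estimate, for each player, the probability of being the first one
    ruined under a toy rule: each round two distinct players are picked
    at random and a fair coin transfers one dollar between them.
    Play stops as soon as any player reaches zero."""
    rng = random.Random(seed)
    first_ruined = [0] * len(capitals)
    for _ in range(trials):
        x = list(capitals)
        while min(x) > 0:
            i, j = rng.sample(range(len(x)), 2)
            if rng.random() < 0.5:
                i, j = j, i            # coin decides the direction of transfer
            x[i] += 1
            x[j] -= 1
        first_ruined[x.index(0)] += 1
    return [c / trials for c in first_ruined]

# The smallest initial stack is the most likely to be ruined first.
print(n_player_ruin([3, 4, 5]))
```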

Notes

  1. David, Florence Nightingale (1998). Games, Gods, and Gambling: A History of Probability and Statistical Ideas. Courier Dover Publications. ISBN 978-0486400235.
  2. Edwards, J. W. F. (April 1983). 'Pascal's Problem: The 'Gambler's Ruin''. Revue Internationale de Statistique. 51 (1): 73–79. doi:10.2307/1402732. JSTOR 1402732.
  3. Jan Gullberg, Mathematics from the Birth of Numbers, W. W. Norton & Company. ISBN 978-0-393-04002-9.
  4. Kaigh, W. D. (April 1979). 'An attrition problem of gambler's ruin'. Mathematics Magazine. 52.
  5. 'Chipping Up In Poker'. Retrieved 2020-10-26.

References

  • Epstein, R. (1995). The Theory of Gambling and Statistical Logic (Revised ed.). Academic Press.
  • Ferguson, T. S. Gambler's Ruin in Three Dimensions. Unpublished manuscript: https://www.math.ucla.edu/~tom/
  • Kraitchik, M. (1942). '§6.20: The Gambler's Ruin'. Mathematical Recreations. New York: W. W. Norton. p. 140.
  • Shoesmith, E. (1986). 'Huygens' solution to the gambler's ruin problem'. Historia Math. 13 (2): 157–164. doi:10.1016/0315-0860(86)90028-5.
  • Stigler, Stephen M. (1990). The History of Statistics: The Measurement of Uncertainty before 1900. Belknap Press. ISBN 978-0-674-40341-3.
  • Swan, Yves C.; Bruss, F. Thomas (2006). 'A Matrix-Analytic Approach to the N-Player Ruin Problem'. Journal of Applied Probability. 43 (3): 755–766. doi:10.1017/S0021900200002084.

External links

  • The Gambler's Ruin at MathPages
  • The Gambler’s Ruin Simulation at Wolfram Demonstration Project