# The Kelly criterion for gambling

Assume that a gambler can bet a fraction ${l}$ of his capital on the outcome of a specific event. The Kelly criterion, first presented in [1] and summarized below, finds the ${l}$ that maximizes the exponential rate of growth of the gambler’s capital under different scenarios, which is equivalent to maximizing, period by period, the expected log utility of the current capital.

Why this choice of objective makes sense was formally discussed in [2] and might be the subject of a future post. Intuitively, the criterion is appropriate if you bet regularly and reinvest your profits.

Exponential rate of growth

Let’s define a quantity ${G}$, called the exponential rate of growth of the gambler’s capital:

$\displaystyle G = \lim_{N \rightarrow \infty} \frac{1}{N} \log \frac{V_N}{V_0} \ \ \ \ \ (1)$

and ${V_N}$ is the gambler’s capital after ${N}$ bets, ${V_0}$ is his starting capital, and the logarithm is to the base two. ${G}$ is the quantity we want to maximize.

Perfect knowledge

In the case of perfect knowledge, the gambler would know the outcome of the event before anyone else and would be able to bet his entire capital at each bet. Then, ${V_N = 2^N V_0}$ and ${G = 1}$.

Binary events

Consider now a binary event where the gambler has a probability ${p}$ of success and a probability ${q = 1 - p}$ of failure. In this case the gambler would go broke with probability ${1}$ for large ${N}$ if he bet all his capital on each bet, even though the expected value of his capital after ${N}$ bets is given by

$\displaystyle E[V_N] = (2p)^N V_0$
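The tension between a growing expectation and almost-sure ruin is easy to check numerically; the values of ${p}$ and ${N}$ below are illustrative:

```python
# Betting the entire capital each round with success probability p < 1:
# the expected capital explodes, but surviving N rounds requires
# winning every single bet.
p, V0, N = 0.6, 1.0, 100   # illustrative values

expected = (2 * p) ** N * V0   # E[V_N] = (2p)^N * V0
survive = p ** N               # P(not broke after N bets)

print(f"E[V_N]       = {expected:.3e}")   # astronomically large
print(f"P(not broke) = {survive:.3e}")    # astronomically small
```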

Because of that, let us assume that the gambler will bet a fraction ${l}$ of his capital each time. Then

$\displaystyle V_N = (1+l)^W (1-l)^L V_0$

where ${W}$ and ${L}$ are the number of wins and losses after ${N}$ bets. Following the definition given in Eq. (1), it can be shown that

$\displaystyle G = p \log (1 + l) + q \log(1-l),\text{ with prob. 1} \ \ \ \ \ (2)$

Maximizing Eq. (2) with respect to ${l}$ gives

$\displaystyle l = p - q \quad \text{ and } \quad G_{\text{max}} = 1 + p \log p + q\log q$

where ${p - q}$ is called the edge.

If the payoff is ${B}$ for a win and ${-1}$ for a loss, then the edge is ${Bp - q}$, the odds are ${B}$, and

$\displaystyle l = \frac{Bp - q}{B} = \frac{\text{edge}}{\text{odds}}$
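The edge/odds formula can be verified against a brute-force maximization of ${G(l)}$; the win probability and payoff below are illustrative:

```python
import math

p = 0.55        # assumed win probability (illustrative)
q = 1 - p
B = 2.0         # payoff per unit staked on a win

# closed-form Kelly fraction: edge / odds
l_star = (B * p - q) / B   # 0.325 for these numbers

# brute-force check: maximize G(l) = p*log2(1 + B*l) + q*log2(1 - l)
def G(l):
    return p * math.log2(1 + B * l) + q * math.log2(1 - l)

l_num = max((i / 100000 for i in range(100000)), key=G)

# with even money (B = 1) the formula reduces to l = p - q
print(l_star, l_num)
```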

Multiple outcome events

Let’s now consider the case where the event has more than two possible outcomes, not necessarily equally likely.

– Fair odds and no “track take”

Let’s first consider the case of fair odds and no “track take”, that is

$\displaystyle \text{odds}_s = \frac{1}{p_s}\quad \text{ and } \quad \sum \frac{1}{\text{odds}_s} = 1$

where ${p_s}$ is the probability of observing the outcome ${s}$ in a given event, as estimated by the entity offering the odds.

Consider ${a_s}$ to be the fraction of the gambler’s capital that he decides to bet on ${s}$ based on his belief of the probability of observing the outcome ${s}$ in a given event. The gambler’s estimated probability for an outcome ${s}$ will be denoted by ${p^{(g)}_s}$.

Since there is no “track take”, there is no loss in generality in assuming that

$\displaystyle \sum a_s = 1.$

That is, the gambler bets his total capital divided among the possible outcomes.

In this case, it is shown in [1] that

$\displaystyle a_s = p^{(g)}_s$

That is, the gambler should allocate his capital according to how likely he thinks each outcome is.
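This can be checked numerically: with ${a_s = p^{(g)}_s}$, the believed growth rate ${\sum_s p^{(g)}_s \log_2(a_s \, \text{odds}_s)}$ beats any other allocation on the simplex. The probabilities below (both the gambler’s and the bookmaker’s) are illustrative:

```python
import math
import random

# gambler's estimated probabilities (illustrative), and the bookmaker's
# fair odds odds_s = 1/p_s built from slightly different estimates
pg = [0.5, 0.3, 0.2]
pb = [0.45, 0.35, 0.2]
odds = [1 / p for p in pb]

def growth(a):
    # believed growth rate: sum_s p_s^(g) * log2(a_s * odds_s)
    return sum(p * math.log2(x * o) for p, x, o in zip(pg, a, odds))

best = growth(pg)   # allocation a_s = p_s^(g)

random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in pg]
    a = [x / sum(w) for x in w]      # random allocation on the simplex
    assert growth(a) <= best + 1e-12

print(round(best, 6))   # positive: the gambler's edge over the bookmaker
```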

– Unfair odds and no “track take”

In this case

$\displaystyle \sum \frac{1}{\text{odds}_s} = 1$

but ${\text{odds}_s}$ are not necessarily equal to ${1/p_s}$. Since there is no track take we can still consider ${\sum a_s = 1}$.

Here, the value of ${a_s}$ that maximizes ${G}$ is again given by ${a_s = p^{(g)}_s}$. Interesting conclusions can be drawn from this result:

• As with the case of fair odds, ${G}$ is maximized by putting ${a_s = p^{(g)}_s}$. That is, the gambler ignores the posted odds in placing his bets!
• Subject to ${\sum (1/\text{odds}_s) = 1}$, the value of ${G}$ is minimized when ${\text{odds}_s = 1/p_s}$. That is, any deviation from fair odds helps the gambler.
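Both bullet points can be verified numerically. With ${a_s = p^{(g)}_s}$, the believed growth rate is ${\sum_s p^{(g)}_s \log_2(p^{(g)}_s \, \text{odds}_s)}$; the probabilities below are illustrative:

```python
import math

pg = [0.5, 0.3, 0.2]   # gambler's probabilities (illustrative)

def growth(odds):
    # G with a_s = p_s^(g): sum_s p_s^(g) * log2(p_s^(g) * odds_s)
    return sum(p * math.log2(p * o) for p, o in zip(pg, odds))

fair = [1 / p for p in pg]                 # odds_s = 1/p_s^(g)
assert abs(sum(1 / o for o in fair) - 1) < 1e-12
print(growth(fair))                        # 0.0: fair odds give no growth

# any other odds satisfying sum 1/odds_s = 1 give the gambler a larger G
skewed_probs = [0.45, 0.35, 0.2]           # still sums to 1
skewed = [1 / p for p in skewed_probs]
print(growth(skewed) > growth(fair))       # True
```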

– When there is a “track take”

In case there is a track take, it can no longer be assumed that ${\sum a_s = 1}$. Let ${b = 1 - \sum a_s}$ be the fraction not bet by the gambler.

The maximization process derived in [1] may be summarized as follows:

• (a) Permute indices so that ${p^{(g)}_s \times \text{odds}_s \geq p^{(g)}_{s+1} \times \text{odds}_{s+1}}$
• (b) Set ${b}$ equal to the minimum positive value of

$\displaystyle \frac{1 - p_t}{1 - \sigma_t},\quad \text{where}\quad p_t = \sum_{s=1}^t p^{(g)}_s,\quad \sigma_t = \sum_{s=1}^t \frac{1}{\text{odds}_s}$

• (c) Set ${a_s = \max(p^{(g)}_s - b/\text{odds}_s, 0)}$. The ${a_s}$ will sum to ${1 - b}$.

It should be noted that if ${p^{(g)}_s \times \text{odds}_s < 1}$ for all ${s}$, no bets are placed. But if the largest ${p^{(g)}_s \times \text{odds}_s > 1}$, some bets might be made for which ${p^{(g)}_s \times \text{odds}_s < 1}$, i.e. bets whose expected gain is negative. This violates the criterion of the classical gambler, who never bets on such events.
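A sketch of steps (a)-(c) in code, under the reading that the “minimum positive value” in step (b) is taken over prefixes with ${\sigma_t < 1}$; the probabilities and odds in the example are illustrative (a bookmaker keeping roughly a 10% take):

```python
def kelly_track_take(probs, odds):
    n = len(probs)
    # (a) order outcomes by p_s * odds_s, descending
    order = sorted(range(n), key=lambda s: probs[s] * odds[s], reverse=True)
    p = [probs[s] for s in order]
    o = [odds[s] for s in order]
    # (b) b = minimum positive value of (1 - p_t) / (1 - sigma_t)
    b, pt, st = 1.0, 0.0, 0.0
    for t in range(n):
        pt += p[t]
        st += 1.0 / o[t]
        if st < 1.0:
            cand = (1.0 - pt) / (1.0 - st)
            if 0.0 < cand < b:
                b = cand
    # (c) a_s = max(p_s - b/odds_s, 0); the a_s sum to 1 - b
    a = [0.0] * n
    for rank, s in enumerate(order):
        a[s] = max(p[rank] - b / o[rank], 0.0)
    return b, a

# illustrative example: sum of 1/odds_s is about 1.11 (a ~10% take)
b, a = kelly_track_take([0.6, 0.25, 0.15], [1.8, 3.0, 4.5])
print(round(b, 6), [round(x, 6) for x in a])   # 0.9 [0.1, 0.0, 0.0]
```

Only the first outcome (the one with ${p^{(g)}_s \times \text{odds}_s > 1}$) receives a bet here, and the fractions indeed sum to ${1 - b}$.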

References:

[1] Kelly, J. L. (1956). A new interpretation of information rate. IRE Transactions on Information Theory, 2(3), 185-189.
[2] Breiman, L. (1961). Optimal gambling systems for favorable games. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability.
[3] MacLean, L. C., Thorp, E. O., & Ziemba, W. T. (Eds.). (2011). The Kelly capital growth investment criterion: Theory and practice (Vol. 3). World Scientific.

# Declining marginal utility and the logarithmic utility function

I recently read the translation of Daniel Bernoulli’s paper from 1738. His work on utility and the measurement of risk was translated into English under the title “Exposition of a new theory on the measurement of risk” and published in Econometrica in 1954 [1]. This work is also contained in [2], an excellent book I recently acquired. The paper is easy to read and yet very powerful, especially if we consider it was written in 1738(!) when Daniel Bernoulli was 25. It proposes the notion of declining marginal utility and its implications for decision making, and is considered a fundamental piece of modern decision theory.

Declining marginal utility

Prior to this work, it was assumed that decisions were made on an expected value, or linear utility, basis. Bernoulli then developed the concept of declining marginal utility, which led to the logarithmic utility function. The general idea of declining marginal utility, also referred to as “risk aversion” or “concavity”, is crucial in modern decision theory.

He criticized the notion of linear utility with the following simple and intuitive example: Assume a lottery ticket pays ${20000}$ with ${50\%}$ chance or ${0}$ with ${50\%}$ chance, for an expected value of ${10000}$. He then concludes that a very poor person would be well advised to sell this lottery ticket for ${9000}$ (which is below the expected value), while a rich man would be ill-advised to refuse to buy it for ${9000}$, meaning that a rule based solely on expected value makes no sense.
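Bernoulli’s example can be reproduced with the logarithmic utility he proposes later in the paper: the certainty equivalent of the ticket depends on current wealth. The wealth levels below are illustrative, not taken from the paper:

```python
import math

# log-utility certainty equivalent of Bernoulli's ticket:
# pays 20000 or 0 with probability 1/2 each (expected value 10000)
def certainty_equivalent(wealth):
    eu = 0.5 * math.log(wealth + 20000) + 0.5 * math.log(wealth)
    return math.exp(eu) - wealth

# illustrative wealth levels for the "poor" and "rich" decision makers
print(round(certainty_equivalent(5000)))     # 6180: selling for 9000 is wise
print(round(certainty_equivalent(100000)))   # 9545: buying for 9000 is a bargain
```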

He then goes on to redefine the concept of value to a more general one. “The determination of the value of an item must not be based on its price, but rather on the utility it yields. The price of the item is dependent only on the thing itself and is equal for everyone; the utility, however, is dependent on the particular circumstances of the person making the estimate. Thus there is no doubt that a gain of one thousand ducats is more significant to a pauper than to a rich man though both gain the same amount.”

He then goes on to postulate that “it is highly probable that any increase in wealth, no matter how insignificant, will always result in an increase in utility which is inversely proportionate to the quantity of goods already possessed.” That is, he not only presented the notion of declining marginal utility but also proposed a specific functional form [3], namely

$\displaystyle du = x^{-1}dx \Longrightarrow u(x) = \ln (x),$

hence the logarithmic utility function. The conclusion is then that a decision must be made based on expected utility rather than on expected value.

Practical applications

The paper also provides an interesting overview of the applicability of the notion of declining marginal utility. For example, in gambling he concludes that “anyone who bet any part of his fortune, however small, on a mathematically fair game of chance acts irrationally”, since the expected utility will be smaller than the utility of the sum of money the gambler already possesses. He also proposed an exercise to inquire how great an advantage the gambler must enjoy over his opponent in order to avoid any expected loss. His result also shows mathematically the widely accepted fact that “it may be reasonable for some individuals to invest in a doubtful enterprise and yet be unreasonable for others to do so”.
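The claim about fair games follows directly from the concavity of the logarithm, and a quick numerical check makes it concrete (the wealth and stake below are illustrative):

```python
import math

# a mathematically fair bet: win or lose a stake b with probability 1/2
w, b = 100.0, 10.0   # illustrative wealth and stake, with 0 < b < w
eu_bet = 0.5 * math.log(w + b) + 0.5 * math.log(w - b)
eu_pass = math.log(w)
print(eu_bet < eu_pass)   # True: the fair bet lowers expected utility
```

By strict concavity, ${\sqrt{(w+b)(w-b)} = \sqrt{w^2 - b^2} < w}$, so this holds for any positive stake.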

Using a merchant example, he computes how much wealth one should have to abstain from insuring one’s assets, or else what minimum fortune a man must have to justify offering insurance to others. Again, due to declining marginal utility, one acts rationally by buying insurance at a premium higher than the expected value of the transaction (risk aversion), a situation commonly seen in practice (otherwise insurance companies wouldn’t make money).
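The insurance conclusion can also be checked with log utility; the numbers below are illustrative rather than Bernoulli’s own merchant figures:

```python
import math

# illustrative numbers: wealth 10000, 5% chance of losing 8000
w, loss, p_loss = 10000.0, 8000.0, 0.05
expected_loss = p_loss * loss            # 400
premium = 500.0                          # deliberately above the expected loss

eu_uninsured = (1 - p_loss) * math.log(w) + p_loss * math.log(w - loss)
eu_insured = math.log(w - premium)

# paying 500 to avoid a 400 expected loss still raises expected utility
print(eu_insured > eu_uninsured)   # True
```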

He also demonstrated mathematically the benefits of investment diversification. And as if all this were not enough, his ideas shed light on the St. Petersburg paradox.

Conclusion

Although written in ${1738}$, Daniel Bernoulli’s paper on utility theory is amazing and remains as relevant today as it was in the ${18}$th century. It proposes the idea of declining marginal utility as well as a functional form for it, namely the logarithmic utility function. He applies his ideas to gambling, insurance and finance, giving you the feeling that the paper could have been written today. Well worth reading.

References:

[1] Bernoulli, D. (1954). Exposition of a new theory on the measurement of risk. Econometrica: Journal of the Econometric Society, 23-36.
[2] MacLean, L. C., Thorp, E. O., & Ziemba, W. T. (Eds.). (2011). The Kelly capital growth investment criterion: Theory and practice (Vol. 3). World Scientific.
[3] Lengwiler, Y. (2009). The Origins of Expected Utility Theory. In Vinzenz Bronzin’s Option Pricing Models (pp. 535-545). Springer Berlin Heidelberg.