# Profitability and Information

In the previous post (Optimal Market Exposure), we saw that trading requires information superior to the “outside world’s” belief in order to be profitable. The trader may obtain this information by analyzing market fundamentals, technicals, or both.

It is intuitively obvious that better information facilitates better trading and larger profits, but here we will prove an exact equivalence between profitability and information. Specifically, we will prove that, for a standard fixed SL/TP trade, the optimal expected utility gain is equal to the information gain (Kullback-Leibler divergence) relative to the “outside world’s” belief. In simpler words:

• 1 bit of “inside information” can double the trader’s account ($2^1 = 2$).
• 0.5 bits can give the expected utility growth corresponding to 41% account growth ($2^{0.5} \approx 1.41$).
• 0.1 bits can give the expected utility growth corresponding to 7% account growth ($2^{0.1} \approx 1.07$).
• etc.
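As a minimal sketch of the conversion above: with logarithmic utility measured in bits, a utility gain of n bits corresponds to multiplying the account by $2^n$.

```python
# With log-utility measured in bits, a utility gain of n bits
# corresponds to multiplying the account by 2**n.
def growth_factor(bits: float) -> float:
    return 2.0 ** bits

for bits in (1.0, 0.5, 0.1):
    print(f"{bits} bit(s) -> account growth factor {growth_factor(bits):.2f}")
# 1 bit doubles the account; 0.5 bits -> ~1.41x; 0.1 bits -> ~1.07x
```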

## Derivation

As in the previous post:

• we will consider the stereotypical trade, with two possible outcomes: SL and TP.
• p is the probability of a win, conditional on the efficient-market hypothesis (i.e. according to the “outside world’s” belief).
• b is the probability of a win, according to the belief of the trader.

In the previous post we mentioned that the efficient-market hypothesis (according to which the market follows a completely random walk) implies $\large{(1-p):p=T:S}$, where S and T are the distances from the entry price to the SL and TP levels.

• Let r be a variable proportional to the investment made (exposure taken) for the trade, such that a win amounts to r(1-p) of account size and a loss amounts to rp of account size. (In terms of the previous post’s variables r=fS/pm, but this is irrelevant.)
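The definitions above can be checked numerically. Under the EMH relation $(1-p):p=T:S$, any exposure r is a fair bet: its EMH-expected account change is zero. The values of S, T and r below are illustrative, not taken from the post.

```python
# Illustrative SL/TP distances and exposure (assumed values).
S, T, r = 30.0, 70.0, 0.5

# EMH win probability from (1-p):p = T:S, i.e. p = S/(S+T).
p = S / (S + T)

# A win gains r*(1-p) of the account, a loss costs r*p,
# so the EMH-expected account change is:
expected_change = p * r * (1 - p) - (1 - p) * r * p
print(p, expected_change)  # expected_change is 0 for any r
```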

Now, a random coin toss (which has 50-50 odds) generates one binary digit (bit) of Shannon information on each toss. If it is known that the coin is biased, then a reduced amount of information is generated because a fraction of that information is known before the coin is tossed.

Prior information = total information (1 bit) – toss information (entropy)

In the case of our trade the prior information is given by:

$\large{I_p=\log(2) - p \log(\frac{1}{p}) - (1-p) \log(\frac{1}{1-p})}$

$\large{I_b=\log(2) - b \log(\frac{1}{b}) - (1-b) \log(\frac{1}{1-b})}$

$I_p$ is the information known by the “efficient market” (or “outside world”) about the outcome of our trade at the time it is opened, while $I_b$ is the information known by the trader (if his information is good).
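A small sketch of the prior-information formula, in bits (so that $\log(2) = 1$); the probabilities passed in are illustrative:

```python
from math import log2

# Prior information (in bits) known in advance about a binary outcome
# with win probability q: total information (1 bit) minus entropy.
def prior_information(q: float) -> float:
    entropy = -q * log2(q) - (1 - q) * log2(1 - q)
    return 1.0 - entropy

print(prior_information(0.5))  # fair coin: nothing known in advance
print(prior_information(0.9))  # heavy bias: most of the bit is already known
```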

The posterior utility (assuming logarithmic utility) expected according to the trader’s belief is given by $\large{G=b\log(1+r(1-p))+(1-b)\log(1-rp)}$

By maximizing this w.r.t r (or by substituting the result from the previous post), we find that $\large{r_{best} = \frac{b-p}{p(1-p)}}$
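For completeness, here is the omitted maximization step. Setting the derivative to zero,

$\large{\frac{dG}{dr} = \frac{b(1-p)}{1+r(1-p)} - \frac{(1-b)p}{1-rp} = 0}$

Cross-multiplying gives $b(1-p)(1-rp) = (1-b)p(1+r(1-p))$; collecting the terms in $r$ yields $b(1-p)-(1-b)p = r\,p(1-p)$, and the left-hand side simplifies to $b-p$, giving the result above.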

Therefore $\large{G_{max} = b \log\frac{b}{p}+(1-b)\log\frac{1-b}{1-p}}$

which is called the Kullback-Leibler divergence (also called information gain, information divergence, or relative entropy), and measures the information “disagreement” between the trader and the efficient-market hypothesis.
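A quick numeric sanity check of this step, using illustrative (assumed) values of b and p: plugging $r_{best}$ into $G$ reproduces the Kullback-Leibler divergence exactly.

```python
from math import log

b, p = 0.45, 0.30  # illustrative trader / market win probabilities

# Optimal exposure and the resulting expected log-utility gain.
r_best = (b - p) / (p * (1 - p))
G_max = b * log(1 + r_best * (1 - p)) + (1 - b) * log(1 - r_best * p)

# Kullback-Leibler divergence KL(b || p).
kl = b * log(b / p) + (1 - b) * log((1 - b) / (1 - p))
print(G_max, kl)  # the two agree
```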

It can also be written as: $\large{G_{max} = I_b - I_p - (b-p)\log\frac{p}{1-p}}$

which shows that it is a close approximation to the difference between the information known by the trader and that known by the “outside world”. Closer inspection of the original result above (the Kullback-Leibler divergence) shows that it is in fact exactly equal to this difference in information, as estimated by the trader. This allows us to express the result precisely in non-mathematical (but complicated) English:
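The identity above can also be checked numerically (illustrative b and p; natural-log units, so $\log(2)$ appears explicitly):

```python
from math import log

b, p = 0.45, 0.30  # illustrative probabilities (assumed values)

def prior_info(q: float) -> float:
    """Prior information log(2) - entropy, in natural-log units (nats)."""
    return log(2) + q * log(q) + (1 - q) * log(1 - q)

# Left side: KL(b || p); right side: I_b - I_p - (b-p)*log(p/(1-p)).
kl = b * log(b / p) + (1 - b) * log((1 - b) / (1 - p))
rhs = prior_info(b) - prior_info(p) - (b - p) * log(p / (1 - p))
print(kl, rhs)  # equal
```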

## Conclusion

The optimal utility growth expected (by the trader) is equal to the prior information known about the outcome of the trade by the trader (according to the trader) minus the prior information known about the outcome of the trade by the “world”, according to the trader.
