On any given day, ask the man on the street where he thinks the stock market is headed. He'll most probably say he doesn't have a clue and would rather toss a coin to find out. To the man on the street, it's always a 50:50 chance of the market going up or down.

The other day we had a very interesting discussion about entropy in one of our CFE classes and how it can be interpreted within the context of financial markets. The fundamental question was whether an equi-probable tree (two states, each with a 50% probability of occurrence) for modelling the random move of an asset price is the only reality that matters, because only an equi-probable tree is in mathematical equilibrium. Any attempt at making these probabilities different from 50% contravenes the law of maximum entropy.

Entropy is the measure of total randomness in a system. In more understandable terms, entropy is the measure of disorder in any isolated system. This is in keeping with the original definition of the term in physics, which says that "entropy" is a measure of the number of random ways in which an isolated system can be organized.

Probabilistically, entropy is defined as a measure of the state of knowledge. That is, entropy measures how much information is needed, on average, to describe the state of the system. The more entropy there is, the more uncertainty there is. Therefore, probabilistically, entropy measures the uncertainty, as far as information content is concerned, in the system. This is the interpretation used both in computer science (information theory) and in quantitative finance. Entropy in information theory is also called the Shannon entropy.

If p is the probability of the occurrence of a random event (or, say, of the state of a system containing only a single particle) then the entropy contribution of the event (or of the system) is given by:

S = -p log(p)

In information theory, where we generally talk about the binary entropy function (a system can exist in only one of two states: 1 or 0, "yes" or "no", "electrical charge" or "no electrical charge"), the logarithm is taken to base 2.
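The binary entropy function in base 2 can be sketched in a few lines of Python (the function name is my own; the formula is the standard Shannon definition):

```python
import math

def binary_entropy(p):
    """Shannon entropy, in bits, of a two-state system where one state has probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # the fair-coin case: exactly 1 bit of uncertainty
print(binary_entropy(0.9))  # a biased coin is less uncertain, so entropy is lower
```

Note that entropy falls away from 1 bit as soon as the two probabilities are no longer equal, which is the point the discussion below builds on.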

If there are many particles - equivalent to many assets in a financial system - in an isolated system, then the entropy function, which expresses the total randomness of the system, can be expressed as the sum over all states:

H = -(p_1 log(p_1) + p_2 log(p_2) + ... + p_n log(p_n)) = -Σ p_i log(p_i)

where p_i is the probability of the i-th state.
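The many-state entropy sum above is equally simple to compute. A minimal sketch, taking a list of state probabilities and returning entropy in bits:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # terms with p = 0 contribute nothing, so they are skipped
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 states: 2 bits
print(entropy([1.0, 0.0, 0.0, 0.0]))      # fully determined: 0 bits
```

As with the binary case, the uniform distribution over n states gives the largest possible entropy, log2(n) bits.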

It is a universal law of nature - whether we are talking about physics, computer science, social or biological systems - that the entropy of an isolated system always increases, in such a way that when the system attains equilibrium the entropy is at its maximum.

All systems, including financial systems (financial markets, asset markets, etc.) move towards equilibrium. Therefore, it is a natural tendency of the system to move towards the state of maximum entropy.

Say there is only one stock in the system and the stock's movement is a binary function, i.e. the stock can either go up or down in the next period. The probability of the up move is p and that of the down move is (1 - p). Therefore, the entropy of the system will be:

H(p) = -p log(p) - (1 - p) log(1 - p)

To find the value of p, the probability of the occurrence of the random event (the up move of the stock price), that maximizes the entropy, we need to take the first derivative of the above expression with respect to p and equate it to zero.

Differentiating the above expression term by term and equating the result to zero gives:

dH/dp = -log(p) - 1 + log(1 - p) + 1 = log((1 - p)/p) = 0

which implies (1 - p)/p = 1, i.e. p = 1/2. (The base of the logarithm does not affect the location of the maximum.)

Therefore, if a stock (asset) price is modelled as a binomial system (a binomial tree) in which there are two states of movement (an up move and a down move) in the next period, then for the system to be in equilibrium - that is, at maximum entropy - the probabilities of the up move and the down move must each equal 50%.
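The conclusion of the derivation can be checked numerically: scanning the binary entropy function over a fine grid of probabilities should pick out p = 0.5 as the maximizer. A minimal sketch:

```python
import math

def binary_entropy(p):
    """Shannon entropy, in bits, of an up/down move with P(up) = p."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# scan p over a fine grid in (0, 1) and keep the entropy-maximising value
best_p = max((i / 1000 for i in range(1, 1000)), key=binary_entropy)
print(best_p)                  # the equi-probable tree
print(binary_entropy(best_p))  # the maximum entropy, 1 bit
```

This agrees with the calculus above: any tilt away from 50:50 reduces the entropy of the one-period tree.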

