
Understanding Entropy – A Key Concept in Physics and Finance

Team Latte
April 19, 2012

The concept of entropy is related to that of probability. Entropy can be defined as the measure of "disorder" in a system. A system with high order has low entropy, whereas one with low order (disorder) has high entropy. Entropy is a fact of life and nature. It is found in all physical, natural and financial systems. For example, as we get older, entropy increases; when a star like our sun burns out and becomes a red giant or a white dwarf, entropy increases; when the pollution in our environment increases, entropy increases.

Ludwig Boltzmann was the first to derive a mathematical expression for entropy. However, to understand entropy fully one needs to understand a "microstate" and a "macrostate". A macrostate denotes the gross, overall description of a system, say our body, and a microstate refers to the specific arrangement of the particles of that system – in this context, the cells of our body. In physics, a gas inside a container is a macrostate and the particles – the molecules – of the gas comprise the microstates. In the context of finance, or a financial system, such a characterization of a microstate and a macrostate may not appear that intuitive, but it still holds true. A stock price can be considered to be a microstate and the option price, or the derivatives based on that stock, can be seen as the macrostate. Or, the financial assets traded in a financial market may be considered to be the microstates and the given financial market (asset market) the macrostate. A macrostate subsumes microstates, and there can be – and in most systems there are – many microstates in a single macrostate.

Thermodynamic entropy of a macrostate is simply defined as the natural logarithm of the number of microstates consistent with this macrostate. If $N$ denotes the size of the macrostate (say, the total number of coins or particles) and $n$ denotes the number of them in a particular configuration (the microstate), then the number of microstates in a given macrostate can be found using the following formula for combinations:

$$\Omega = \binom{N}{n} = \frac{N!}{n!\,(N-n)!}$$

The sign "!" in the above formula denotes factorial, where $N! = N \times (N-1) \times (N-2) \times \cdots \times 2 \times 1$, so that $3! = 6$, $4! = 24$, and so on. The above formula simply states how many ways there are to choose $n$ objects from $N$ objects. For example, in how many ways can we form a 3-member team from 12 candidates? The answer is $\binom{12}{3} = 220$. Or say, if there are 200 coins, then in how many ways can we get exactly one (1) head? The answer is 200 (because any one of the 200 coins can be the head).
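To make the counting concrete, here is a minimal Python sketch that verifies the two counts above (it uses the standard-library function math.comb; the script itself is only an illustration and not part of the original article):

```python
import math

# Ways to form a 3-member team from 12 candidates: C(12, 3)
print(math.comb(12, 3))   # 220

# Ways to get exactly 1 head from 200 coins: C(200, 1)
print(math.comb(200, 1))  # 200
```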

Boltzmann defined (in fact, he derived this expression) entropy as:

$$S = k_B \ln \Omega$$

Or, in other words, we can write the expression for thermodynamic entropy as:

$$S = k_B \ln \binom{N}{n} = k_B \ln \frac{N!}{n!\,(N-n)!}$$

The constant, $k_B$, is known as Boltzmann's constant and is an extremely small number.

Say, we have a system where there are 200 coins. This is the macrostate, whereas each particular combination of heads and/or tails across these 200 coins is a microstate. Now, in how many ways can we get 1 "head" out of the 200 coins in a single toss? The answer is: $\binom{200}{1} = 200$. If, for the sake of simplicity, we assume the value of the constant, $k_B$, to be one (which is not true, as $k_B$ is a very small number, equal to about $1.38 \times 10^{-23}$ joules per kelvin), then the entropy of this system – with the possibility of one head out of 200 coins – is given by:

$$S = \ln 200 \approx 5.3$$

Now, if we want to find the entropy of the system where we want to have, say, 25 "heads" out of 200 coins in a single toss, it would be given by:

$$S = \ln \binom{200}{25} \approx 72.9$$

The entropy increases when we increase the number of "heads" from 1 to 25 in a system of 200 coins. If we continue the above exercise, finding the entropy of this 200-coin system as we increase the number of heads through 5, 10, 20, 100, 150, 175, 195 and all the way up to 200, we will notice an interesting fact: the entropy first increases and then decreases as the number of "heads" in a single toss of the 200-coin system grows.
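As a rough check of this increase-then-decrease pattern, here is a small Python sketch (purely illustrative, with Boltzmann's constant again set to one so that the entropy is just the natural logarithm of the number of microstates):

```python
import math

N = 200  # total number of coins (the macrostate)

def entropy(n_heads, n_coins=N):
    """Entropy with k_B = 1: S = ln(number of microstates) = ln C(n_coins, n_heads)."""
    return math.log(math.comb(n_coins, n_heads))

for n in [1, 5, 10, 20, 25, 100, 150, 175, 195, 200]:
    print(n, round(entropy(n), 1))

# The entropy climbs from about 5.3 at 1 head to a maximum of roughly 135.8
# at 100 heads, then falls back to 0 at 200 heads (all heads can occur in
# only one way).
```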

How is Entropy related to Probability?

In physics, entropy can be defined probabilistically as the measure of the number of random ways in which an isolated system can be organized. The entropy of a system comprising random events and the probability of those random events are related; entropy can also be thought of as the measure of the total randomness in a system. If $p$ is the probability of the occurrence of a random event, then the entropy of that event is given by:

$$S = -k_B \ln p$$

(For equally likely microstates, $p = 1/\Omega$, so this is the same as Boltzmann's expression $S = k_B \ln \Omega$.)

In information theory, entropy is defined as the measure of the state of knowledge: it measures how much information there is in the system.

If there are many particles – equivalent to many assets in a financial system – in an isolated system, then the entropy function, which expresses the total randomness of the system, can be expressed as:

$$S = -k_B \sum_i p_i \ln p_i$$

where $p_i$ is the probability of the system being in the $i$-th microstate.

The above expression gives the relationship between entropy and probability.
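As a small illustration of this last expression, the sketch below (with hypothetical probabilities and $k_B$ again set to one) computes the entropy of a discrete probability distribution, for instance the weights assigned to a handful of assets or market states:

```python
import math

def gibbs_entropy(probabilities, k_B=1.0):
    """Entropy S = -k_B * sum(p_i * ln(p_i)) of a discrete distribution."""
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

# A uniform distribution over 4 states is maximally random: S = ln(4) ~ 1.386
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))

# A more concentrated (more "ordered") distribution has lower entropy: ~0.940
print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))
```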


Reference: Excel for Engineers and Scientists, S.C. Bloch, John Wiley & Sons, 2003

