Monday, November 22, 2010

are entropy and risk related?

A friend of mine who is a plasma physicist posted a personal version of his presentation on the nature of the instability generated by a Maxwell's Demon wire-array. Well, who doesn't like Maxwell's Demon, the little imp who goes around subverting the Second Law of Thermodynamics by decreasing entropy?

Well, since my friend's version had a lot more physics humor in it, I said he had invented a new field: "stand-up physicist" (although some may claim Feynman has first dibs on that)... my friend went on to say that next time he should relate the demon-size collapse to our financial cycles. Heh heh...

But wait... I think that implies a really interesting question: is the concept of risk (including financial risk) somehow related to entropy?

In a time when our markets are driven more and more by the math of physics and information theory, the idea that lowering financial risk is somehow akin to lowering entropy would be a very deep insight into the limits of a financial system.

Think about it. So far, Maxwell's demon hasn't beaten the Second Law in the large... you may lower entropy locally, but in the large, things always bounce back to a net entropy increase. Sound familiar? Markets and quants may be able to locally lower risk through the use of financial derivatives, but ultimately, in the large, the markets always bounce back.

Wow. I just googled for "risk entropy" and apparently people in the field are already well aware of the connection. Well, even if it's not original, it's still a fascinating relationship.

Actually, this source summed it up great (duh!):

Any project, large or small, is associated with expected and unexpected problems. The analogy mentioned above could be derived from the Second Law of Thermodynamics. The Second Law of Thermodynamics deals with a concept: entropy. Entropy, in short, is the amount of disorder in a system. Entropy is also a measure of the information contained in a system. In information theory, entropy is considered the amount of uncertainty in a given system. This has a defined relation: "As the amount of information increases, the disorderliness of a system (entropy) decreases."
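
To make that last idea a bit more concrete, here's a minimal Python sketch of my own (a toy example, not taken from the quoted source); the asset names and bucketed return labels are purely hypothetical. It computes Shannon entropy for two made-up return histories, and the more unpredictable asset comes out with the higher entropy, which is the sense in which uncertainty, and loosely risk, lines up with entropy.

import math
from collections import Counter

def shannon_entropy(outcomes):
    # Shannon entropy (in bits) of a discrete list of observed outcomes.
    counts = Counter(outcomes)
    total = len(outcomes)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two hypothetical assets, with returns bucketed into coarse labels
# so the distribution is discrete.
steady_asset   = ["flat", "up", "flat", "up", "flat", "flat", "up", "flat"]
volatile_asset = ["up", "down", "crash", "up", "flat", "down", "spike", "crash"]

print("steady asset entropy:  ", round(shannon_entropy(steady_asset), 3))
print("volatile asset entropy:", round(shannon_entropy(volatile_asset), 3))
# The volatile asset's outcome distribution is more spread out, so its
# entropy (uncertainty) is higher, a crude stand-in for higher risk.

Running it gives roughly 0.95 bits for the steady asset and 2.25 bits for the volatile one: more spread-out outcomes, more uncertainty, more entropy.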