Super-fast computers continue to increase their role in financial markets.
They first came into prominence in H2 2009, when the ‘correlation trade’ began. Their role has nothing to do with price discovery, the market’s traditional function. Instead, they trade on algorithms.
Their aim is to trade arbitrage opportunities between markets on a nanosecond-by-nanosecond basis. These price discrepancies are incredibly small, so the computers make their returns through leverage, trading millions of contracts at a time.
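The scale involved is easier to see with a toy calculation. The sketch below is purely illustrative: the prices, contract count and contract size are invented numbers, not taken from the article or any real market.

```python
# Minimal sketch of the cross-market arbitrage idea described above.
# All figures are hypothetical, for illustration only.

def arbitrage_pnl(price_a: float, price_b: float,
                  contracts: int, contract_size: float) -> float:
    """Profit from buying in the cheaper market and selling in the dearer one.

    The per-unit discrepancy is tiny, so returns come from scale:
    trading a very large number of contracts at once.
    """
    discrepancy = abs(price_a - price_b)   # e.g. a fraction of a cent
    return discrepancy * contracts * contract_size


# A $0.0002 price gap is negligible on one contract, but meaningful
# across two million contracts (ignoring fees and slippage).
print(round(arbitrage_pnl(100.0002, 100.0000,
                          contracts=2_000_000, contract_size=1.0), 2))
# -> 400.0
```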
Eighteen months later, these computers now dominate many markets. The above chart from Petromatrix details the position during Q4 on one of the biggest US exchanges, the Chicago Mercantile Exchange (CME). It shows that the algorithmic trading computers traded:
• ~70% of equity volume (red column) and 50% of all contracts (blue)
• ~50% of all energy volume, including oil, and over 30% of contracts
• ~80% of all foreign exchange (Forex) volume
The problem, of course, is that the correlation trade creates a situation where no single market knows what it is actually trading. Supply/demand fundamentals become irrelevant, particularly when central banks such as the US Fed make vast quantities of liquidity available through QE2 to fund the trading.
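What the ‘correlation trade’ measures can be shown with a short, purely illustrative calculation. The daily returns below are made-up figures, not market data; the point is only that when the statistic sits near 1.0, the two markets move in lockstep regardless of their individual fundamentals.

```python
# Illustrative only: Pearson correlation of two invented return series.
import statistics

def correlation(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation of two equal-length return series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical daily % returns for an oil contract and an equity index.
oil    = [0.8, -0.5, 1.2, -0.3, 0.9]
equity = [0.7, -0.4, 1.0, -0.2, 0.8]
print(correlation(oil, equity))   # ~0.999: the two markets move together
```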
Fundamentals will, of course, reassert themselves once the Fed withdraws its QE2 liquidity. When this happens, probably in Q2, we may well see considerable re-pricing take place, as physical market conditions suddenly matter once again.