Robots trading the stock market


These models by their very nature rule out crises, and thus are also a poor tool for tackling big changes. One has to move up a level and embrace the superior approach of agent-based models. Unfortunately, mainstream economists, and policymakers for that matter, are not prepared (some are even hostile) to acknowledge that financial markets are complex systems. Conventional financial regulation is also hopeless. Experience from the recent crisis suggests it is in the nature of markets to innovate around regulations, and the nature of risk taking will inevitably keep changing as financial systems become more sophisticated [ 5 ].

Regulatory frameworks have to adapt to circumstances that are changing too fast for regulation to succeed, and the robots have something to do with this. Thanks to the flash crash of May 6th, when the Dow Jones Industrial Average plunged by nearly 1,000 points in a matter of minutes, the merits of high-frequency trading are under scrutiny by regulators, in particular the Securities and Exchange Commission. The flash crash report [ 6 ] identified that "on May 6, when markets were already under stress, the sell algorithm chosen by the large trader to only target trading volume, and neither price nor time, executed the sell program extremely rapidly in just 20 minutes".

At a later date, the large fundamental trader executed trades over the course of more than 6 hours to offset the net short position accumulated on May 6. The report then concluded that "one key lesson is that, under stressed market conditions, the automated execution of a large sell order can trigger extreme price movements, especially if the automated execution algorithm does not take prices into account".
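What the report describes is, in essence, a volume-participation execution algorithm. The sketch below is purely illustrative, not the actual May 6 code: the 75,000-contract position and 9 percent participation rate are roughly the figures cited in the report, while the function name and the volume stream are invented. It shows why sizing child orders by observed volume alone, with price and elapsed time never consulted, can dump a position at an accelerating pace.

```python
# Illustrative sketch of a volume-only sell program (hypothetical; not the May 6 algorithm).

def volume_only_sell(total_quantity, participation, volume_per_interval):
    """Liquidate total_quantity by taking a fixed share of each interval's traded volume.

    total_quantity      -- contracts to sell in total
    participation       -- fraction of observed volume to take, e.g. 0.09
    volume_per_interval -- iterable of traded volume observed in each interval
    """
    remaining = total_quantity
    for t, volume in enumerate(volume_per_interval):
        if remaining <= 0:
            break
        child_order = min(remaining, participation * volume)   # price is never consulted
        remaining -= child_order
        print(f"t={t}: sell {child_order:,.0f}, remaining {remaining:,.0f}")
    return remaining

# Feedback danger: if the program's own selling is churned back and forth by other
# algorithms, observed volume rises, the next child orders grow, and the position is
# dumped far faster than the nominal participation rate suggests.
volume_only_sell(75_000, 0.09, [40_000, 90_000, 200_000, 450_000, 700_000])
```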

I call your attention to the fact that the SEC does not aim to act pre-emptively, but only to react in the aftermath. Stock exchanges do not publicly release data about these mini-crashes, but most active traders say there are currently at least a dozen a day [ 7 ]. In summary, financial regulation in its current form is apparently not working. Bak and Paczuski remarkably observed [ 8 ] that in complex systems, large, catastrophic events occur as a consequence of the same dynamics that produces small, ordinary events.

This statement runs counter to the usual way of thinking about large events. But large dynamic systems naturally evolve, or self-organize, into a highly interactive, critical state where a minor perturbation may lead to events of all sizes. The individual events cannot be predicted, but their statistical distribution is predictable.
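A minimal sandpile-style simulation, in the spirit of the self-organized criticality models Bak helped introduce, makes the point concrete. The lattice size and number of grain drops below are arbitrary illustrative choices, not parameters taken from any market study.

```python
# Bak-Tang-Wiesenfeld-style sandpile: identical tiny perturbations, avalanches of all sizes.
import random
from collections import Counter

N = 16                                   # lattice size (N x N); illustrative choice
grid = [[0] * N for _ in range(N)]

def topple():
    """Relax every unstable site; return the number of topplings (the avalanche size)."""
    size = 0
    unstable = True
    while unstable:
        unstable = False
        for i in range(N):
            for j in range(N):
                if grid[i][j] >= 4:      # site is unstable: shed 4 grains to neighbours
                    grid[i][j] -= 4
                    size += 1
                    unstable = True
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < N and 0 <= nj < N:
                            grid[ni][nj] += 1   # grains pushed off the edge are lost
    return size

avalanches = []
for _ in range(20000):
    # The perturbation is always the same: a single grain on a random site.
    grid[random.randrange(N)][random.randrange(N)] += 1
    avalanches.append(topple())

sizes = [a for a in avalanches if a > 0]
print("most common avalanche sizes:", Counter(sizes).most_common(5))
print("largest avalanche:", max(sizes))
```

Every perturbation is a single grain, yet the resulting cascades range from one toppling to avalanches orders of magnitude larger; no individual avalanche can be foreseen, while the heavy-tailed size distribution reappears run after run.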

If stock markets are viewed as complex systems, the trigger of a crash cannot be synonymous with its cause. A confluence of factors, rather than a particular trigger, is the proper explanation; otherwise, the crisis would never get started. In fact, the flash crash of the DJIA shows the footprints of a complex system at work; I have witnessed this in a study carried out with another couple of students [ 9 ].

With a colleague, we went further [ 10 ]. Thanks to increasing high-frequency trading, correlations previously seen only across hours or days in trading time series may now be showing up on timescales of minutes or seconds [ 11 ].

Although financial regulation cannot succeed with the conventional tools currently in wide use, in theory one can still rely on the control theory of self-organized systems [ 8 ]. This approach has been applied to a number of complex systems, including attempts to influence the group behavior of cockroaches aggregating in shelters.

Engineers have devised autonomous cockroach robots that rely on self-organization as the main coordination mechanism. The controller of each individual robot was designed using reactive, behavior-based techniques.

Socially integrated autonomous robots, perceived as conspecifics by the group of cockroaches and acting as interactive decoys, were able to control the group's self-organized choice of shelter. Inspired by this, my students and I then suggested the use of socially integrated robot traders in stock markets to function as an anti-bubble decoy [ 1 ]. We borrowed from these models of cockroaches choosing a shelter, as well as from models of information transfer in fish shoals, and applied their modeling principles to the stock market.

We first replicated the characteristics of actual stock markets, with their price dynamics of highs and lows. In the presence of extreme events, such dynamics cannot be accommodated by a Gaussian probability distribution; variances in the real financial world are too high for Gaussian standards. We were fortunate to find that, after introducing socially integrated contrarian robots, the stock price dynamics could be controlled so as to make the market more Gaussian.

This plainly means the bubbles and crashes disappeared. So we think this blueprint for market stability, if correctly engineered, may offer a credible alternative to monetary policy and current financial regulation.
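The statement that the controlled market is "more Gaussian" can be put as a simple moment test: the excess kurtosis of returns is zero for a Gaussian series and strongly positive for a crash-prone, fat-tailed one. The sketch below only illustrates the statistic on two synthetic return series; it is not output from the model of [ 1 ], and every number in it is invented.

```python
# Excess kurtosis as a "how Gaussian is this market?" yardstick (illustrative only).
import random
import statistics

def excess_kurtosis(returns):
    m = statistics.fmean(returns)
    s2 = statistics.fmean((r - m) ** 2 for r in returns)
    m4 = statistics.fmean((r - m) ** 4 for r in returns)
    return m4 / s2 ** 2 - 3.0                    # a Gaussian series gives roughly 0

random.seed(0)

# Stand-in for an uncontrolled market: calm most of the time, occasional extreme moves.
fat_tailed = [random.gauss(0, 0.10 if random.random() < 0.02 else 0.01)
              for _ in range(50_000)]

# Stand-in for a controlled market: plain Gaussian returns of comparable typical size.
gaussian = [random.gauss(0, 0.01) for _ in range(50_000)]

print("crash-prone series, excess kurtosis:", round(excess_kurtosis(fat_tailed), 1))  # far above 0
print("Gaussian series,    excess kurtosis:", round(excess_kurtosis(gaussian), 1))    # close to 0
```

Driving this number toward zero once the contrarian robots are switched on is one concrete way of reading the claim that the bubbles and crashes disappeared.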

And while monetary policy and standard financial regulation are conspicuous, an extra advantage of contrarian algorithms is that they are cryptic. We are not selling the final recipe for preventing financial crashes; rather, we are suggesting the ingredients and the type of cake. How to pay for the costs of the contrarian robot-trading system still needs refining.

Rise of the machines

Eric Hunsader of the US data firm Nanex believes robot traders fiddle the market, placing and then cancelling orders just before the critical buying moment.

Nanex reported that the case marked the first time two large trading companies had been in a spoofing dispute, which led the CME, the exchange on which the trades were made, to review its regulations. Concern over the dangers of robot traders has led others to probe, including New York Attorney General Eric Schneiderman, who is investigating their potential for manipulation. In the May 2010 flash crash, the Dow Jones Industrial Average dropped by nine percent, nearly 1,000 points, in the space of minutes.

A high-profile probe by the SEC found that computerised traders were behind the decline. According to Nanex, those moves might have been premeditated attempts at manipulation, although some, including the SEC report, reject the idea. Either way, the circuit breakers put in place to prevent such shock incidents failed to act, a worrying indicator of their fallibility. Hunsader says HFTs sometimes, somehow, operate outside the five to ten percent parameters set by their programmers.

Those blunders are likely to continue unless systems and regulators improve, according to Hunsader. Major firms are reluctant to make their systems more transparent, however, for fear that other companies could copy their transaction patterns. Human traders, of course, come with flaws of their own. That seems to be the view of Wall Street trader-turned-Cambridge University neuroscientist John Coates, who explores the risk-taking element of trading and its physiological effects in his book, The Hour Between Dog and Wolf.

He writes that the biological response to risk-taking impairs human judgement, causing jumps and crashes in the stock market. Computers should theoretically be able to stabilise markets, evading the problems human activity entails, but incidents like the Flash Crash suggest the contrary.

Insufficient replacements

What robot traders do evade are the human-specific elements that have for so long been fundamental, and beneficial, to trading. As human traders' control wanes and the IT personnel monitoring the algorithms take over, so too do conscious risk-taking, decision-making and intuition, which computers simply cannot mimic.