
do trading algorithms create a shadow market?

High frequency trading algorithms are creating market patterns humans can barely follow. As they get even faster, the market may be under their control more often than it's under ours.

Your money is at the mercy of an enormous financial beast that even its human handlers can’t fully understand, one that creates a complex shadow market nearly undetectable to even the most experienced bankers and brokers as it runs through its bust-to-boom-to-recovery cycles within minutes, if not seconds. Countless trades are now under machines’ control, and no one knows what will happen as we continue to accelerate them even further. It just makes you want to shudder and turn a fearful eye toward your new algorithmic financial overlords, right? Well, that’s the impression you would get from a Wired Science post on a paper about the trades made by financial algorithms involved in high frequency trading, or HFT. In reality, we actually have a decent grasp of what these algorithms do and even a clue as to how they interact, so the trades being made are not random moves by programs acting in a completely unpredictable fashion. No, the problem is that they’re going too fast for humans to intervene, executing in 940 milliseconds or less, and because they all employ extremely similar strategies, their boom-bust-recovery cycles, or flash crashes, happen before humans even know it. To prevent this digital herd behavior, we either need humans to intervene, or we need to introduce new algorithms.

First and foremost, let’s establish why anyone would want to make a trade at a 940 millisecond interval. While a stock may barely move between market open and market close, its daily fluctuations could be quite high as buyers and sellers push it back and forth throughout the day. Some stocks swing wildly; others move a few pennies here or ten cents there. An extremely fast algorithm, or algo in Wall Street parlance, could swoop in and make a myriad of trades as the stock fluctuates, taking advantage of as many transactions as possible by hitting the peaks and valleys at the right time. Forget rumors or even company fundamentals. This is about riding the wave of fluctuations, nothing more. But what if a lot of algos are attracted to a particular set of stocks displaying high volatility that day? They would either trigger massive buy-ins or sell-offs, then attempt to fix the situation by feeding on their self-created volatility. Usually this happens very quickly and at a small scale, but there have been cases where these events really registered. In May of 2010, they swung the Dow Jones by an astounding 1,200 points in just 25 minutes, leaving humans wondering why the index fell by about 600 points only to surge back by the same 600 or so points almost immediately afterward. The post-mortem by the SEC and CFTC pointed the finger squarely at HFT algos creating the volatility loop I just described…
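To make the "riding the wave" idea concrete, here's a toy sketch of a strategy that buys dips below a short moving average and sells pops above it. Every name, window size, and threshold here is my own illustration, not anything from the paper or from real HFT systems, which are vastly more specialized and operate in microseconds rather than interpreted Python.

```python
# Toy sketch (illustrative only): buy the valleys, sell the peaks of
# intraday fluctuations around a short moving average. The window and
# band parameters are hypothetical, chosen just to show the mechanism.
from collections import deque

def toy_fluctuation_trader(prices, window=5, band=0.01):
    """Return a list of (index, action) decisions over a price series."""
    recent = deque(maxlen=window)   # rolling window of the latest prices
    holding = False
    decisions = []
    for i, p in enumerate(prices):
        recent.append(p)
        if len(recent) < window:
            continue                # not enough history yet
        avg = sum(recent) / len(recent)
        if not holding and p < avg * (1 - band):
            holding = True          # price dipped below the band: buy
            decisions.append((i, "buy"))
        elif holding and p > avg * (1 + band):
            holding = False         # price popped above the band: sell
            decisions.append((i, "sell"))
    return decisions
```

On a flat price series this sketch does nothing; it only trades when a price strays outside the band around its own recent average, which is exactly the kind of self-referential trigger that becomes a problem when many algos share it.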

The combined selling pressure from the Sell Algorithm, HFTs and other traders drove the price of the E-Mini S&P 500 down approximately 3% in just four minutes, from the beginning of 2:41 P.M. through the end of 2:44 P.M. During this same time, cross-market arbitrageurs who did buy the E-Mini S&P 500 simultaneously sold equivalent amounts in the equities markets, driving the price of SPY (which represents the S&P 500 index) down approximately 3% as well. Still lacking sufficient demand from fundamental buyers or cross-market arbitrageurs, HFTs began to quickly buy and then resell contracts to each other — generating a “hot-potato” volume effect as the same positions were rapidly passed back and forth. Between 2:45:13 and 2:45:27, the HFTs traded over 27,000 [futures] contracts, which accounted for about 49% of the total trading volume, while buying only about 200 additional contracts net.

What’s more, the regulators even know whose algos triggered the sell-off: those of Waddell & Reed Financial Inc., a financial planning firm that sold off too many futures contracts. The rest of the algorithms followed suit, and as they sold half the futures available for trading that day in 14 seconds, their actions cascaded across the entire market; by the time humans knew something had gone very, very wrong, the index had plunged. And by the time they started trying to rescue the market, the algorithms had already driven the now undervalued contracts back up. This is the herding behavior that scares the authors of the arXiv paper, who warn that these flash crashes are dangerous and that there’s no guarantee another sudden 600 point dip would be recovered were it to happen again. And as more and faster algos come online, some targeting the 740 millisecond and faster realm, bigger dips are not out of the question. At this point, the paper and the post covering it part ways, with the post veering into hyperbole by quoting an unrelated researcher in the UK who claims we have no theoretical understanding of how HFT algorithms work at these tiny time spans. But we really do: we know what they target, and we know they don’t perform a lot of deliberate computation, because to execute a trade in under 940 ms, they can’t afford to.
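The cascade itself is easy to caricature. Below is a minimal herd-behavior sketch of my own devising, not the paper's model: a crowd of identical algos share the same stop-loss and bargain-hunting thresholds, so one oversized sell order knocks the price past everyone's stop at once, and the resulting crash immediately looks like a bargain, producing the crash-and-snap-back oscillation. All parameters are hypothetical.

```python
# Minimal herd sketch (my illustration, not the paper's model): identical
# algos with identical thresholds react to each other's trades, turning
# one shock into a repeating crash/recovery cycle.
def simulate_herd(price=100.0, n_algos=50, shock=-5.0,
                  stop_loss=0.97, bargain=0.90, impact=0.2, steps=30):
    """Track the price as identical algos trade off each other."""
    fair_value = price
    price += shock                      # an oversized initial sell order
    history = [price]
    holding = [True] * n_algos
    for _ in range(steps):
        pressure = 0
        for i in range(n_algos):
            if holding[i] and price < fair_value * stop_loss:
                holding[i] = False      # everyone hits the same stop: sell
                pressure -= 1
            elif not holding[i] and price < fair_value * bargain:
                holding[i] = True       # now "undervalued": buy it back
                pressure += 1
        price += impact * pressure      # the herd's trades move the price
        history.append(price)
    return history
```

With these toy numbers the price never settles: every sell-off overshoots into bargain territory and every buy-back overshoots back above the stop, a crude stand-in for the "hot-potato" volume the SEC report describes.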

The paper’s solution to flash crashes involves introducing new strategies for these algorithms to disrupt the herd behavior, having some start buying instead of selling just to goad the pack into doing something new. However, I would be curious to see who would be willing to add even more volatility to the market, because now we’d have a random agent of chaos roaming through the system, potentially making things worse, since these HFTs have real effects on the slower, human-paced market. Between the interactions of these algorithms with each other, human interference from above, and a random algorithmic agent breaking up otherwise predictable machine trades, who knows what kind of trouble we might trigger? After all, the market often runs on fear, greed, hype, and rumor, hardly harbingers of logic and precision, and not exactly good ingredients for stability. Maybe a requirement for slower algorithms could be a good thing, because it would allow for greater diversity in how they make their purchases. When the target is speed, a lot of conditionals, lookups, and validation get thrown away, since they all add latency. When the goal changes to making the best trade choice, these algorithms can be equipped with more logic. And, very importantly, they will stay non-random and predictable.
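That speed-versus-logic tradeoff can be sketched as two contrasting decision functions. Both are hypothetical illustrations of mine, not real trading code: the fast one strips out every check it can afford to lose, while the deliberate one spends its extra time on sanity checks before committing.

```python
# Two hypothetical decision sketches contrasting speed with deliberation.

def fast_decision(bid, last_price):
    # Speed-first: a single comparison, no validation at all.
    return "buy" if bid < last_price else "sell"

def deliberate_decision(bid, last_price, daily_volume, position_limit,
                        position, min_volume=10_000):
    # Slower but saner: check the quote, liquidity, and exposure first,
    # at the cost of the extra conditionals and lookups a speed-first
    # algo would throw away.
    if bid <= 0 or last_price <= 0:
        return "reject"                 # malformed quote
    if daily_volume < min_volume:
        return "hold"                   # too illiquid to trade safely
    if position >= position_limit and bid < last_price:
        return "hold"                   # at exposure limit, don't add more
    return "buy" if bid < last_price else "sell"
```

The fast version will cheerfully trade on a corrupted quote in a frozen market; the deliberate one refuses. Each added guard is one more branch on the critical path, which is exactly why the sub-second crowd deletes them.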

# tech // algorithms / market / stock market / trading
