Predicting “Flash Crashes”
A controversial financial market indicator may be able to prevent short-term crises in the modern computerized trading world.
(originally published by Booz & Company)

Shortly after 2:40 p.m. on May 6, 2010, a downbeat day for U.S. financial markets turned chaotic. Prices for the Chicago-based E-mini S&P 500 futures—the most liquid equity index contract in the world, with US$140 billion in average daily volume—fell rapidly. Before the Chicago Mercantile Exchange’s (CME’s) automatic stabilizer was triggered at 2:45 p.m., momentarily pausing trading, the Dow Jones Industrial Average had lost about 600 points (and was down nearly 1,000 for the day). After trading was allowed to resume, the markets recovered and regained most of the points within 20 minutes.
The so-called flash crash of 2010 was the second-largest point swing, and the largest one-day point decline, in the history of the Dow. The crash seriously damaged the confidence of investors, who withdrew $19.1 billion from domestic equity funds that month—the highest monthly outflow since the peak of the financial crisis in October 2008. It also raised questions about the reliability and effectiveness of today’s financial market indicators, as well as the increasing role played by high-frequency traders, whose computerized programs have reduced transaction times to thousandths and even millionths of a second. (It’s been estimated that as of 2010, high-frequency transactions accounted for about 70 percent of U.S. equity trading.)
In the end, a joint investigation by the Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission reported no evidence of wrongdoing. It concluded that the crisis began when Waddell & Reed Financial Inc., based near Kansas City, tried to aggressively hedge its investment position by selling $4.1 billion in futures contracts in 20 minutes. This statement was met with widespread criticism; the CME itself issued a rare press release, expressing skepticism that one trade could have had so many ripple effects. Among the culprits that various experts have blamed are managers at Waddell & Reed, the technology, the stock market’s structure, and the evolution of the hedge fund industry.
But the flash crash didn’t just trigger a wave of recrimination. It led to a development with potentially lasting impact on financial markets: a newly proposed metric for future volatility, which its proponents say can be used to prevent short-term crashes in the modern computerized trading world. Such crashes occur suddenly and quickly spiral out of control. This new metric, volume-synchronized probability of informed trading (VPIN)—if it works as its creators claim—could become a crucial mechanism that uses probability analysis of past trading behavior to monitor imbalances in trading, predict future behavior, and alert traders when a crisis is imminent. Proponents say this warning signal would enable analysts or regulators to slow down or halt trading before the market gets sucked into a crash. Yet even if its critics are right and the metric isn’t reliable, the debate over its usefulness has further demonstrated the need to address the vulnerabilities inherent in the financial system.
VPIN is designed to calculate “order flow toxicity,” or the probability that informed traders (such as hedge funds, which tend to all buy or sell at the same time) are taking advantage of uninformed traders (such as market makers, who typically lose money when order imbalances occur—that is, when there are so many buy or sell orders for a specific security that it becomes all but impossible to match all the orders). It uses real-time statistical analysis of trading behavior to estimate the relationship between informed traders’ orders and how much liquidity market makers are providing. If the order flow becomes too toxic from informed traders’ activity, electronic market makers will stop supplying liquidity to help stem their losses, which can create a cascading effect on other market makers and trigger an avalanche of similar withdrawals—the sort of frenzied activity that precipitates a flash crash.
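How such a toxicity reading might be computed can be made concrete with a small sketch. The Python example below is an illustration only, not the authors’ patented procedure: it splits the trade stream into equal-volume buckets, classifies each trade as a buy or a sell with a naive tick rule (the published metric uses a statistical “bulk volume” classification instead), and averages the absolute order imbalance over a rolling window of buckets. The function name compute_vpin and the parameters bucket_size and window are placeholders chosen for this sketch.

```python
def compute_vpin(trades, bucket_size, window=50):
    """Toy VPIN-style toxicity estimate (illustrative simplification).

    trades      -- iterable of (price, volume) pairs in time order
    bucket_size -- shares of volume per bucket (V)
    window      -- number of recent buckets (n) in the rolling average

    Returns one reading per completed bucket once `window` buckets exist:
    sum over the last n buckets of |buy_volume - sell_volume| / (n * V).
    """
    buckets = []                  # (buy_volume, sell_volume) of completed buckets
    buy = sell = filled = 0.0     # running totals for the bucket being filled
    prev_price, sign = None, 1    # last price and last inferred trade direction
    readings = []

    for price, volume in trades:
        # Naive tick rule: an uptick marks a buy, a downtick a sell,
        # and an unchanged price keeps the previous direction.
        if prev_price is not None and price != prev_price:
            sign = 1 if price > prev_price else -1
        prev_price = price

        remaining = volume
        while remaining > 0:
            take = min(remaining, bucket_size - filled)
            if sign > 0:
                buy += take
            else:
                sell += take
            filled += take
            remaining -= take

            if filled >= bucket_size:             # bucket is full
                buckets.append((buy, sell))
                buy = sell = filled = 0.0
                if len(buckets) >= window:
                    recent = buckets[-window:]
                    imbalance = sum(abs(b - s) for b, s in recent)
                    readings.append(imbalance / (window * bucket_size))

    return readings
```

On this toy definition, a reading near 1 means that almost all recent volume was one-sided, the kind of imbalance the metric treats as toxic, while a reading near 0 means buying and selling were roughly balanced.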
The metric was developed by David Easley and Maureen O’Hara, economists at Cornell University, and Marcos López de Prado, head of high-frequency trading research at the Tudor Investment hedge fund. The trio have filed a patent, and have urged regulators to use VPIN as a watchdog. “Some traders are going to have superior information, for a variety of reasons,” says Easley. “And if you have people with good information, they tend to buy. It becomes imbalanced on the buy side. From the point of view of a market maker, that’s toxic, because the market maker’s job is to provide liquidity.” Using VPIN, Easley and his team retroactively calculated that the stock market registered some of the highest readings of toxicity in recent history an hour before the flash crash.
Not everyone agrees that VPIN will be the market’s savior. The most prominent critics are Torben Andersen, a professor at Northwestern University’s Kellogg School of Management, and Oleg Bondarenko, a professor at the University of Illinois at Chicago. In a paper published in October 2011, they argued that “our empirical investigation of VPIN documents that it is a poor predictor of short run volatility, that it did not reach an all-time high prior [to], but rather after, the flash crash, and that its predictive content is due primarily to a mechanical relation with the underlying trading intensity.”
One problem, they note, is that VPIN conflates trading volume and time: Because trades are grouped sequentially, rather than at regular clock-time intervals, the delineation between two days’ trading sessions is unclear. In short, they argue, VPIN is “highly dependent on when exactly you start counting trades. If you start counting one day later than someone else, your groups will contain different trades and your VPIN will be different.” Using a slightly different data set (which they argue is more accurate historically), the two researchers had to start in 10 to 15 different places before they could replicate VPIN’s results. The implicit concern is that VPIN could lull investors and analysts into a false sense of security.
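That start-point objection can be illustrated with the same hypothetical compute_vpin sketch from above: feeding it one trade stream but beginning the bucketing a few trades later shifts every bucket boundary, so individual readings differ even though the underlying activity is nearly identical. The data here is randomly generated and purely illustrative.

```python
import random

# Purely illustrative, randomly generated trades: (price, volume) pairs.
random.seed(0)
trades = [(100 + random.gauss(0, 0.05), random.randint(1, 500))
          for _ in range(20000)]

# Same stream, two different starting points for the volume bucketing.
series_a = compute_vpin(trades, bucket_size=10_000, window=50)
series_b = compute_vpin(trades[100:], bucket_size=10_000, window=50)

# The bucket boundaries no longer line up, so the readings disagree even
# though the two series describe (almost) the same trading activity.
print(series_a[-1], series_b[-1])
```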
Andersen says he has no desire to get into a mudslinging match, and that his work on the subject is ongoing, but that he has “accumulated additional strong evidence that VPIN is not working as advertised.” Easley says his group’s research on VPIN also continues, and that Andersen and Bondarenko simply performed a fundamentally “different analysis—reasonable, but different.”
Meanwhile, regulators are increasingly concerned about their ability to keep up with the trend toward ultra-fast trading; in the spring of 2012, the SEC went before Congress to ask for a 2013 budget increase of $245 million, largely to protect investors and to “strengthen oversight of market stability, and expand the agency’s information technology systems.” The SEC has also taken note of VPIN’s possibilities. In late 2011, the agency assigned a group from the University of California at Berkeley to investigate VPIN’s promise, and in a working paper, the Berkeley group wrote that VPIN did “indeed give strong signals ahead of the Flash Crash event on May 6 2010. This is a preliminary step toward a full-fledged early-warning system for unusual market conditions.”
Regulators and industry groups are also taking steps to humanize the world of computerized trading. In June 2012, a group of 24 brokers and traders sued CME Group, which owns major commodities exchanges in New York and Chicago, in an attempt to overturn new rules that cater to high-frequency traders. And in July, the SEC approved a new rule that will require exchanges and the Financial Industry Regulatory Authority to jointly devise a plan for the development of a consolidated audit trail, which would track every order, cancellation, modification, and execution of a trade for all listed equities across all U.S. markets.
Yet just one month after the SEC took this step, a mini flash crash occurred when Knight Capital Group Inc., whose market-making division handles about 10 percent of U.S. equity volume, lost $440 million and saw its stock plunge more than 70 percent after a “software glitch” dumped a huge number of orders into the market. “Those kinds of things are inevitable because they’re computer programs,” Easley says. “There’s not going to be a perfect one. The question is, What are your controls?”
Several countries around the world are cracking down on high-frequency trading, most recently Australia, Canada, and Germany. But because this form of trading is still dominant in the United States, and likely to remain a key feature of the U.S. financial markets for the foreseeable future, the need for a strong indicator to keep the system in check and protect investors will only continue to grow. The debate over VPIN may be taking place largely in the ivory tower, but if the metrics of probability can truly tame the dangers inherent in computerized trading, the consequences will reach far beyond academia.
Reprint No. 00144
Author profile:
- Matt Palmquist is a freelance business journalist based in Oakland, Calif., and the author of s+b’s Recent Research column.