REDWOOD CITY, Calif. — The financial networks have always led innovation in data analytics and technology adoption, and it's no surprise that they have created some of the biggest of the big data challenges when it comes to processing and moving data.
Steve Hirsch, the chief data officer at Intercontinental Exchange, was a fountain of statistics in a talk on financial data at VentureBeat’s DataBeat Data Science Summit today. Intercontinental handles a ton of data for the New York Stock Exchange as well as 22 other real-time financial exchanges around the world.
The networks now provide 14 billion stock quotes in a single day for U.S. equity options markets. Forty-eight huge network pipes supply that information, running constantly at 5 gigabits per second. Those networks deliver 10 million messages per second, and latency, or delay in interaction, is measured in microseconds. Intercontinental Exchange has to deal with 7 terabytes of fresh data a day. That's what it takes to be in touch with the market today, Hirsch said.
He said we've come a long way since the creation of the first telegraph-based ticker tape machine in 1867. The financial industry has led the way in innovation in data analytics. It introduced the Quotron stock ticker machines that appeared in films like Wall Street, the 1987 movie that starred Charlie Sheen. Stock exchanges had a monopoly on trading price information, and even sources such as Google and Yahoo only had access to data that was delayed 15 minutes. That was true as recently as five years ago.
But the financial industry and the U.S. Securities and Exchange Commission recognized that the democratization of data would lead to more fair trading and, as a consequence, significantly more trading of stocks.
In 1998, the SEC introduced Regulation ATS, which permitted new trading exchanges to compete with NASDAQ and the New York Stock Exchange. Those exchanges armed themselves with servers and open systems that could scale horizontally.
“It was the rise of the machines,” Hirsch said.
The change was dramatic. Hirsch got into the trading scene with Archipelago, a company that united various "islands" among the exchanges and helped democratize data. Over the past decade, daily trading volume peaks have increased 1,000 times. That's a consequence of freeing up this information.
In 2007, the SEC introduced Regulation NMS. That enabled the connection of a dozen trading exchanges, which have since multiplied to 40. It also introduced the complex problem of making sure that prices were consistent in real time across 40 different data sets and that trading was fair.
“The complexity that came with this was unintended,” Hirsch said. “How do you [watch] a market with 40 different places to trade?”
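Keeping prices consistent across dozens of venues boils down to consolidating every exchange's quotes into a single best price. A minimal sketch of that idea, with hypothetical venue names and illustrative prices (not how any real exchange feed works):

```python
# Hypothetical sketch: consolidating quotes from many trading venues into
# one best bid and best offer, the kind of cross-exchange consistency
# problem that Regulation NMS created. Prices are made up for illustration.

quotes = {  # venue -> (bid, ask)
    "VenueA": (99.98, 100.02),
    "VenueB": (99.99, 100.03),
    "VenueC": (99.97, 100.01),
}

def best_bid_offer(quotes: dict[str, tuple[float, float]]) -> tuple[float, float]:
    """Return the highest bid and the lowest offer across all venues."""
    best_bid = max(bid for bid, _ in quotes.values())
    best_ask = min(ask for _, ask in quotes.values())
    return best_bid, best_ask

print(best_bid_offer(quotes))  # (99.99, 100.01)
```

A real consolidated feed must do this continuously, in microseconds, across 40 venues — which is exactly the scale problem Hirsch describes.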
The SEC's Regulation NMS is still in effect today. The easy part of financial trading data was that it was structured data: a bunch of numbers. But now the most sophisticated traders are building their predictive models with historical data and then extending them with real-time sensors for new kinds of data inputs. That means they are doing sentiment analysis on Facebook, Twitter, and traditional news sources.
“That feeds into the machines that trade all day long,” Hirsch said.
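In crude terms, sentiment analysis of this kind scores incoming text and feeds the aggregate score to trading algorithms as a signal. The keyword-lexicon scorer below is a hypothetical stand-in for the trained language models real trading desks use; the feed and word lists are invented for illustration:

```python
# Hypothetical sketch of sentiment analysis feeding a trading signal.
# Real systems use trained NLP models; this keyword lexicon stands in for one.

POSITIVE = {"beat", "surge", "growth", "upgrade", "record"}
NEGATIVE = {"miss", "plunge", "lawsuit", "downgrade", "explosion"}

def sentiment_score(text: str) -> int:
    """Count positive minus negative keyword hits in a message."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def aggregate_signal(messages: list[str]) -> float:
    """Average sentiment over a window of messages; the sign suggests direction."""
    if not messages:
        return 0.0
    return sum(sentiment_score(m) for m in messages) / len(messages)

feed = [
    "ACME posts record earnings as shares surge",
    "Analyst upgrade for ACME after growth beat",
    "ACME faces lawsuit over data breach",
]
print(aggregate_signal(feed))  # positive value suggests a bullish tilt
```

The AP hack described below shows the downside: a system like this has no way to tell a hoax tweet about explosions from a real one.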
The system is not perfect, Hirsch said, as evidenced by the “flash crash” that happened on May 6, 2010. On that day, the Dow Jones Industrials plummeted 9.2 percent in a matter of minutes, with no clear reason except a panic that propagated through automated trading systems. It happened again in April when hackers took over the Associated Press Twitter feed and planted a hoax that there were explosions in the White House and the president was hurt. The AP immediately disavowed the tweet, but it was retweeted 3,000 times. The Dow Jones Industrials fell for 9 minutes before recovering.
Those automated crashes created fear among investors, so the traders have now built limits into the system to pause trading during big falls.
"We have seen what happens when you let markets run," Hirsch said. "We believe it is too complex. There are now data-driven safety nets that stop flash crashes. They introduce a pause in trading."
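The core of such a safety net is a simple rule: if prices fall too far, too fast, halt trading. A minimal sketch of a price-based circuit breaker, with an illustrative threshold rather than the actual exchange rules:

```python
# Hypothetical sketch of a price-based circuit breaker ("data-driven safety net").
# The 7% threshold here is illustrative, not any exchange's actual rule.

def should_halt(reference_price: float, last_price: float,
                threshold: float = 0.07) -> bool:
    """Halt trading if the price has dropped more than `threshold`
    from the reference price (e.g., the prior close)."""
    drop = (reference_price - last_price) / reference_price
    return drop >= threshold

# A 9.2% drop, like the 2010 flash crash, would trip a 7% breaker.
print(should_halt(100.0, 90.8))  # True
```

The pause this introduces gives human traders and market supervisors time to assess whether the move reflects real news or an automated cascade.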
Monitoring the 23 exchanges is the full-time job of 200 data science professionals. Intercontinental Exchange's data science product, Data Dispatch, is marketed by Pivotal.