Challenges around AI for electronic trading and assuring performance of next-gen infrastructure dominated panel discussions at last week’s New York event.
The STAC Summit in New York provided a fascinating progress report on how electronic trading is making use of AI, and its perpetual pursuit of faster infrastructure to reliably deliver data and to drive competitive advantage.
We are at a point where financial firms are far enough down the road on infrastructure projects to assess the challenges that come with AI. The goal is to leverage machine learning (ML) and deep learning (DL) to improve outcomes or automate functions, but it’s not without its challenges, according to Dr. Michel Debiche, Director of Analytics Research at STAC.
Michel kicked off the event with a discussion on progress toward benchmarking standards for ML/DL techniques and technologies, and highlighted a couple of obstacles. While algorithms and advanced technologies like image recognition have enabled ML to extract meaningful signals from noise, it faces some extra challenges when applied to financial markets.
Almost by definition in trading, the search is for novel signals in market data and market-related information: clues on where the market is heading, based on a combination of unknowns and hard-to-measure sentiment. Michel warned of the risk that algorithms will be trained to misinterpret information and end up reacting to market data that is spurious and misleading.
A related challenge, not fully explored during the discussion, is the quality and completeness of the data being delivered to these technologies. Missing or stale information can just as easily taint algorithmic results. As live market data and alternative data feed volumes grow, so does the need to assure timely and complete delivery of increasingly complex data services. More on that later.
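As a minimal sketch of the detection side of that problem (illustrative only, not a description of any Corvil product), a feed consumer can at least flag sequence-number gaps and unusually long silences between messages; the message shape and thresholds here are assumptions:

```python
from dataclasses import dataclass

@dataclass
class FeedMessage:
    seq: int     # publisher sequence number
    ts_ns: int   # feed timestamp, nanoseconds

def find_feed_issues(messages, max_silence_ns):
    """Report sequence-number gaps and overly long silences.

    Deliberately minimal: real feed handlers also deal with sequence
    resets, retransmission channels, and per-line arbitration.
    """
    gaps, stale_after = [], []
    prev = None
    for msg in messages:
        if prev is not None:
            if msg.seq != prev.seq + 1:
                gaps.append((prev.seq + 1, msg.seq - 1))   # missing range
            if msg.ts_ns - prev.ts_ns > max_silence_ns:
                stale_after.append(prev.seq)               # feed went quiet
        prev = msg
    return gaps, stale_after

msgs = [FeedMessage(1, 0), FeedMessage(2, 1_000), FeedMessage(5, 60_000)]
print(find_feed_issues(msgs, max_silence_ns=10_000))  # ([(3, 4)], [2])
```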
A further challenge is the need for explainable AI, already a GDPR requirement that demands that companies be able to show how algorithms arrived at their conclusions – not easy with ML and DL neural networks that are designed to learn by themselves without human intervention.
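Explainability techniques for black-box models do exist; permutation importance is one common, model-agnostic example (an illustrative sketch, not something presented at the event): measure how much prediction error grows when a single feature's values are shuffled.

```python
import numpy as np

def permutation_importance(predict, X, y, n_repeats=10, seed=0):
    """Score each feature by how much shuffling it degrades predictions.

    Model-agnostic: works with any predict() function, including a
    neural network whose internals are otherwise opaque.
    """
    rng = np.random.default_rng(seed)
    base_mse = np.mean((predict(X) - y) ** 2)
    importances = []
    for j in range(X.shape[1]):
        increases = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])              # destroy feature j only
            increases.append(np.mean((predict(Xp) - y) ** 2) - base_mse)
        importances.append(np.mean(increases))
    return np.array(importances)

# Toy check: the target depends only on feature 0, so only its
# importance should rise materially above zero.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
y = 2.0 * X[:, 0]
imp = permutation_importance(lambda X_: 2.0 * X_[:, 0], X, y)
print(imp)
```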
These challenges are compounded by skills shortages. A panel on “AI and the Engineer” observed that ML was a team sport where recruiting all the players was proving tricky. While it was acknowledged that many of the required skills already exist in the finance industry – from managing large amounts of data to designing high performance systems – the missing piece was usually around data science.
The panel talked about recruiting elusive ‘unicorn’ employees: in-demand data scientists, the uniquely talented and motivated people best equipped to explore the possibilities that AI opens up. Among the big questions of the day: how to attract them, how to keep them, and what kind of team to build around them.
Success in the AI arena also depends on engineering discipline, having technical people who understand the crucial technologies, not just today but what’s coming down the track.
There is also the challenge of assuring data delivery for those projects – which is where Corvil comes in. The drive for shorter time scales around data capture shows no signs of abating. What was once a conversation with clients about microseconds is now about nanoseconds and even picoseconds – one millionth of one millionth of a second!
I spoke on a panel – “Time Sync and Capture in 2019” – that explored how higher-bandwidth networks (e.g., 40Gbps and 100Gbps) being deployed to handle higher message volumes are driving a new generation of timestamping, capture and analytics technologies. It was timely from Corvil’s perspective; we’ve just launched our Corvil 9000 appliance, and there was an opportunity to discuss how its architecture sets a new standard for network traffic capture and analysis.
The Corvil 9000 leverages the best of each class of storage: RAM for deep buffering and highly durable spinning disks for persistence, achieving more than 40Gbps of sustained capture without compression.
Switching over to solid state disks (SSDs) for our Turbo capability (available later this summer) establishes a world-first sustained capture rate of 60Gbps for 30 minutes, perfect for capturing and analyzing peak volumes around market open and close. As a result, trading infrastructure operations teams can ensure that market data and other feeds are rapidly and reliably delivered, including in some cases to the AI-based technologies searching for opportunity signals.
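For a sense of what sustaining that rate implies for storage, a quick back-of-envelope calculation (decimal units, ignoring packet and capture-format overhead; not a description of the appliance's internals):

```python
def capture_bytes(rate_gbps: float, duration_s: float) -> float:
    """Uncompressed bytes needed to sustain a capture rate (decimal units)."""
    return rate_gbps * 1e9 / 8 * duration_s

# A 60Gbps burst sustained for the 30 minutes around the open:
tb = capture_bytes(60, 30 * 60) / 1e12
print(f"{tb:.1f} TB")  # 13.5 TB
```

Roughly 13.5TB for a single half-hour burst, which is why write throughput and capacity of the capture tier both matter.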
Find out more about Corvil Appliances here.
Learn how a leading quantitative market maker assures market data and news feed delivery with Corvil.