
Infrastructure Amidst Market Volatility - Part 2: A New Foundation


Editor’s Note: This is the second in a two-part series by leading global capital markets expert Larry Tabb.

In part one, “A Strange and Uneasy Time,” we discussed how the volatility related to the COVID-19 pandemic put a strain on the capital markets infrastructure. Now, we discuss how a new foundation is a requirement to survive and thrive in the “new normal.”

The bedrock of modern financial markets’ infrastructure comprises four key elements – data ingestion, analysis, decision making, and processing – as technology plays an ever-larger role in investing and trading. Although this is nothing new, the volume of information that needs to be processed has become massive, and it will grow exponentially as more products become electronic, more decision-making data needs to be analyzed, more transactions are generated, and the complexity of those transactions increases.

Firms’ infrastructure also needs to become more scalable. Capacity must flex up and down, cloud-like: able to process abnormal peak loads that can run at 10x normal volume, yet also able to scale back down so that servers aren’t sitting at only 5 to 15% utilization in normal conditions – otherwise the cost of supporting this infrastructure will outstrip revenues.
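To make that elasticity concrete, here is a minimal Python sketch of the kind of sizing policy a cloud-like facility might apply, re-evaluated as load changes. The function name, per-server throughput, and target utilization are illustrative assumptions, not figures from any particular firm’s stack.

```python
import math

def desired_fleet(observed_rate: float,
                  per_server_rate: float = 1_000_000,  # msgs/sec per server (assumed)
                  target_util: float = 0.60) -> int:
    """Size the fleet so each server runs near target_util at the observed rate.

    Re-evaluated continuously, the fleet grows roughly 10x during a 10x burst
    and shrinks back afterwards, instead of idling at 5-15% utilization.
    """
    return max(1, math.ceil(observed_rate / (per_server_rate * target_util)))

# Normal load vs. a 10x burst (illustrative rates):
for rate in (1_200_000, 12_000_000):
    print(f"{rate:,} msgs/sec -> {desired_fleet(rate)} servers")
```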

This is not a simple task, given that firms are committing capital based on data arriving in bursts of more than 40 million messages per second as they make real-time trading decisions across multiple markets. And as the complexity of trading decisions increases, this will only become more difficult.

Overwhelming trading volumes impact much more than exchange and client-facing linkages, and it is not just real-time trading decisions that are critical. Some of the more complex and difficult challenges occur once the trade is executed, when the cost of an error can be exorbitantly high. It’s in the mid and back office where the shares and cash of executed orders need to be moved, and where processes such as order validation, average pricing, allocations and confirmations, and the distribution of information to clients, custodians, and clearing entities take place.

Even though these processes may not be latency-sensitive, clients and firms have become increasingly intertwined through the electronic, internet, and mobile technologies that tie together customers, counterparties, exchanges, clearing firms, and service providers.

Even with the vast array of algorithmic, online, and mobile interfaces, many mid- and back-office systems are neither online nor real-time. Many firms today still run batch-oriented core processing systems. Although these systems may not be visible from the outside, they are responsible for the bulk of firms’ clearing, settlement, money and securities movement and control, risk management, and accounting functions.

The challenge with older batch systems is that many are volume-constrained. Overwhelming volumes can lengthen the batch cycles of older processing platforms, causing them to crash and/or delay the start-of-day updates of many real-time systems. If these back-office systems fail, they can create operational and regulatory challenges that impact clearing, settlement, and accounting operations. Worse, the downstream problems they trigger can take days, months, or even years to resolve.

What we have learned from the volume and volatility of the last few months is the importance of being able to process more data and analytics with increasing precision. Unfortunately, this has come at a time when decreasing asset management fees and the shift to lower-cost asset management products are putting pressure on all financial firms. These seemingly contradictory requirements – to improve processing capabilities while cutting costs – are forcing firms to invest in more efficient infrastructure. Those that can respond effectively to these challenges will survive; unfortunately, the ones that can’t may not.

The shift to electronic trading has made capital markets firms rethink their data foundations. If their foundation isn’t built on bedrock, they could crumble under the weight of volatility and volume. But what is this bedrock? And what is needed to create a stable and resilient foundation?

As we move into this new age, firms need to onboard data as quickly and efficiently as possible, and at the largest possible scale. Historically this was done by creating individual application programming interfaces (APIs) to traditional technology and service platforms. Increasingly, though, these can be overwhelmed by data tsunamis, and the resulting data queues can cripple firms’ databases and create logjams.
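The logjam dynamic is easy to demonstrate. Below is a hypothetical Python sketch of a serial, bespoke API consumer draining a bounded queue: once the arrival rate exceeds its processing rate, the queue fills, producers block, and everything upstream stalls behind it. The names and rates are invented for illustration.

```python
import queue
import threading
import time

inbox = queue.Queue(maxsize=10_000)      # bounded staging queue (the future logjam)

def serial_api_consumer():
    """Drain one message at a time, as a traditional serial API would."""
    while True:
        inbox.get()
        time.sleep(0.001)                # assume each database write costs ~1 ms
        inbox.task_done()

threading.Thread(target=serial_api_consumer, daemon=True).start()

# A burst arriving faster than ~1,000 msgs/sec overwhelms the consumer:
# once the queue is full, put() blocks and the upstream feed stalls.
for seq in range(50_000):
    inbox.put({"seq": seq, "px": 100.0})
inbox.join()
```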

In response, we see firms incorporating data fabrics, which offer a larger ingestion pipe and can host data in a more widely accessible manner than a traditional serial API. Data fabrics, by virtue of their in-memory architecture, can onboard and process data faster, scale more easily, and make the data accessible to a more comprehensive group of users.

However, data fabrics aren’t all created equal. Some are bound by the amount of available memory. In today’s new and more challenging environment, data fabrics must not only provide an elastic data layer to supply the relevant data to business-critical applications; they must also provide durability to eliminate data loss and ensure reliability in case the available memory is exhausted. This is especially critical during times of high and unexpected volume and volatility.
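One common way to achieve that durability is to pair the in-memory layer with an append-only write-ahead log, so a record is persisted before it is acknowledged and remains recoverable if memory is exhausted or the process dies. The sketch below is a minimal illustration of that pattern; the class name, file name, and eviction policy are assumptions, not any particular data fabric’s API.

```python
import json
import os

class DurableTable:
    """In-memory table backed by an append-only write-ahead log (illustrative)."""

    def __init__(self, wal_path: str = "ticks.wal", max_rows_in_memory: int = 1_000_000):
        self.mem: dict = {}
        self.max_rows = max_rows_in_memory
        self.wal = open(wal_path, "a", buffering=1)   # line-buffered append

    def put(self, key, record):
        # 1. Persist first: the write is durable before it is acknowledged.
        self.wal.write(json.dumps({"k": key, "v": record}) + "\n")
        os.fsync(self.wal.fileno())
        # 2. Then cache in memory, evicting the oldest entry past the cap;
        #    evicted rows remain recoverable by replaying the log.
        if len(self.mem) >= self.max_rows:
            self.mem.pop(next(iter(self.mem)))
        self.mem[key] = record
```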

In addition, firms are seeking better ways to harmonize the data they ingest. They’re using better transformation engines and data-augmentation capabilities, employing analytics and visualization frameworks, and adopting a better transport layer. All of this lets them move data and transactions more efficiently and effectively into the appropriate mid- and back-office platforms.
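As a toy example of what harmonization means in practice, the sketch below normalizes trade records arriving in two different feed formats into one canonical schema before routing them onward. The field names and formats are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CanonicalTrade:
    symbol: str
    quantity: int
    price: float
    venue: str

def harmonize(raw: dict) -> CanonicalTrade:
    """Map either of two hypothetical feed formats onto one canonical schema."""
    if "Sym" in raw:                      # format A: terse exchange-style fields
        return CanonicalTrade(raw["Sym"], int(raw["Qty"]),
                              float(raw["Px"]), raw.get("Venue", "UNKNOWN"))
    return CanonicalTrade(raw["ticker"],  # format B: verbose vendor-style fields
                          int(raw["shares"]), float(raw["price"]),
                          raw.get("mic", "UNKNOWN"))

print(harmonize({"Sym": "IBM", "Qty": "100", "Px": "141.25"}))
print(harmonize({"ticker": "IBM", "shares": 100, "price": 141.30, "mic": "XNYS"}))
```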

Without this type of infrastructure, things will stay clunky. Developers will need to continue creating bespoke APIs and handcrafting every linkage, connection, and service needed – not only to onboard information, but also to mine, analyze, and act on trading data and process those transactions effectively.

In this period of uncertainty and volatility, the firms with solid foundations and the most efficient infrastructure will flourish – not only now, but into the future. Although the recent bout of unpredictability, variability, and volume will subside, we shouldn’t forget the lessons of this data and transactional tsunami: the more that products, markets, and investment strategies rely on ever-greater amounts of information, the more important the way we manage that data becomes, especially in peak periods. But how we prosper when market efficiency returns should remain our top priority.

A hardened foundation will not only determine how firms survive this virus-induced market volatility, but also who thrives when we come out the other side in an increasingly connected, efficient, and quantitative world. Upon bedrock is built success.

Read Part 1: A Strange and Uneasy Time.


About the Author

Larry Tabb
Independent Analyst / Consultant
Tabb Advisors

Larry Tabb is an independent analyst and consultant at Tabb Advisors, focusing on fintech and capital markets issues. He was the founder and research chairman of TABB Group, a research and strategic advisory firm focused exclusively on capital markets, and vice president of TowerGroup’s Securities & Investments practice, where he managed research across the capital markets, investment management, retail brokerage, and wealth management segments. He has also served as a member of the CFTC Technology Advisory Committee Subcommittee on Automated and High Frequency Trading (HFT) and testified before the Senate Subcommittee on Securities, Insurance and Investment at its session on “Computerized Trading: What Should the Rules of the Road Be?” Larry has been cited extensively in The Wall Street Journal, Financial Times, Associated Press, The New York Times, CNN, Bloomberg, CNBC, Reuters, Dow Jones News, Barron’s, Forbes, Business Week, Financial News, and other major business media.
