
Gaining Competitive Edge: The Data Arms Race in Capital Markets


Over the last decade, much has been written about the rising tide of external data sources from which firms can gain information about particular market trends or opportunities. However, there are many internal sources of data that firms have yet to mine fully for business insights and profitability, writes Virginie O’Shea, CEO and Founder of Firebrand Research. In this article, Ms. O’Shea discusses the benefits of combining internal and external data to improve trading performance while gaining greater insight into risk reduction and regulatory compliance.

Capital markets firms are facing competition from many corners, from the continued dominance of passive investing on the buy-side to retail brokers that operate like quasi-fintechs on the sell-side. Gaining competitive edge over both new entrants and incumbents therefore relies on better use of all of a firm’s available assets.

That means gleaning valuable insights in real time from a wide variety of data sources, both internal and external, to better arm your front office decision-makers and trading desks. It also means better understanding your clients’ activities and tailoring your services more appropriately to their requirements.

Over the last decade, much has been written about the rising tide of external data sources from which firms can gain information about particular market trends or opportunities. Whether it’s the evolution of sentiment analytics based on social media data or the addition of new sources of data for tracking environmental information such as satellite imagery, data has been at the forefront of firms’ development of innovative trading strategies, and it will continue to play a significant role in the future.

However, there are many internal sources of data that firms have yet to mine fully for business insights and opportunities, as well as for building a 360-degree view of their clients. Moreover, it is often the combination of different data sets that yields the greatest insight.

Combining internal transaction data with external sentiment data, for example, can highlight patterns of behavior over time that can feed into predictive models. From a transformation standpoint, the successful implementation of artificial intelligence (AI) and machine learning (ML) also requires a high volume of high-quality data.
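As a minimal sketch of this idea (the file names, column names, and model choice are illustrative assumptions rather than anything prescribed here), internal trade records can be joined with external sentiment scores by instrument and time, with the combined features feeding a simple predictive model:

```python
# Sketch only: combine hypothetical internal trade data with external
# sentiment scores and fit a simple predictive model. File names, column
# names, and the model choice are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Internal transaction data: one row per execution.
trades = pd.read_csv("trades.csv", parse_dates=["executed_at"])

# External sentiment data: one score per instrument per observation time.
sentiment = pd.read_csv("sentiment.csv", parse_dates=["observed_at"])

# Attach the most recent sentiment reading available at execution time,
# per instrument, so the model never sees future information.
trades = trades.sort_values("executed_at")
sentiment = sentiment.sort_values("observed_at")
combined = pd.merge_asof(
    trades,
    sentiment,
    left_on="executed_at",
    right_on="observed_at",
    by="ticker",
    direction="backward",
)

# Predict whether a trade was profitable from its size and the sentiment.
features = combined[["notional", "sentiment_score"]].fillna(0.0)
target = (combined["pnl"] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42
)
model = LogisticRegression().fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```

The backward-looking join is the important detail: each trade only sees sentiment observed before its execution time, which is what makes the combined data usable for prediction rather than hindsight.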

The evolution of the sell-side front office has seen personnel shift from pure sales traders to quants, who rely on these technologies and on reliable data to deliver better returns. It has also witnessed an increasing role for risk management in a front-office context, with a related increase in demand for real-time risk data analytics.

Technology has become a catalyst for change, and deploying next-generation tools to handle greater volumes and better manage risk is a competitive advantage. On the buy-side, there is also an efficiency play in managing data better to reduce transaction costs for client portfolios. Staying ahead of market risk is key to better managing liquidity, which has been a regulatory concern within the funds space over the last 24 months.

Static reports based on stale data won’t cut it in today’s fast-moving market environment. The past two years have been characterized by market volatility and black swan events, which make up-to-the-second information critically important. Think, for example, of the difference between responding to breaking news as it happens and responding a few hours or even days later. Everyone in the market understands that information latency can seriously damage trading revenues. Yet C-suite executives often rely on internal reports built on stale data because of legacy technology and operational silos.

Many large financial institutions are in the throes of a multi-year transformation program, and most have placed data alongside digital as a pillar of their strategies for the future. After all, aligning business units horizontally across the organization can only be achieved by introducing a common data foundation.

From a cultural perspective, transformation requires lines of business to be on the same page, and greater collaboration can be enabled by sharing common data stores. This is where many firms have introduced application programming interface (API) platforms, taking a lead from the retail banking industry in Europe and its focus on open banking.
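As an illustration of that pattern (the framework, endpoint, and records shown here are assumptions, not something the article prescribes), a shared client data store can be exposed to every line of business through a small read-only API:

```python
# Sketch only: expose a shared client data store to other lines of
# business via a read-only API. FastAPI, the endpoint shape, and the
# sample records are illustrative choices.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Shared client data API")

# Stand-in for a common data store shared across business lines.
CLIENT_STORE = {
    "C-1001": {"name": "Example Asset Management", "segment": "buy-side"},
    "C-1002": {"name": "Example Brokerage", "segment": "sell-side"},
}


@app.get("/clients/{client_id}")
def get_client(client_id: str) -> dict:
    """Return the canonical client record, whichever team asks for it."""
    record = CLIENT_STORE.get(client_id)
    if record is None:
        raise HTTPException(status_code=404, detail="Unknown client")
    return record
```

Served behind the firm’s API platform, every business line reads the same canonical record instead of maintaining its own copy.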

The move to a cloud environment is also part of this journey, and given the regulatory and industry focus on resilience, ensuring the firm can switch cloud providers in the future, if required, is increasingly important. Avoiding cloud platform lock-in is also a commercial imperative for firms that wish to retain negotiating power with their service providers. A data disintermediation layer between a firm and its cloud or software-as-a-service (SaaS) providers is therefore important to enable faster onboarding and future resilience.
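One way to picture such a layer (a sketch under assumed names; no particular provider or product is implied) is a narrow storage interface that applications code against, with one adapter per provider, so that switching providers means swapping an adapter rather than rewriting applications:

```python
# Sketch only: a narrow, provider-agnostic storage interface. Class and
# method names are illustrative; real adapters would wrap each cloud
# provider's SDK behind this same contract.
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """The contract that application code depends on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Local stand-in; a real adapter would call a provider's SDK."""

    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def archive_trade_report(store: ObjectStore, report: bytes) -> None:
    # Application code depends only on the interface, so changing cloud
    # providers means supplying a different ObjectStore implementation.
    store.put("reports/eod-trades", report)


archive_trade_report(InMemoryStore(), b"end-of-day trade report")
```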

However, firms cannot stop and rebuild everything from scratch; they must work within the parameters of their existing technology architectures. APIs also expose the quality of the underlying data from source systems, which can result in a ‘garbage in, garbage out’ quality problem. This is where technologies such as data fabrics can come into play to normalize data from multiple sources and allow it to be consumable by downstream systems and applications. Business decision-makers can then rely on the high quality of the data on which they base their strategic insights, risk management and future plans.
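A data fabric is a substantial piece of technology, but the normalization step it performs can be sketched simply (the source schemas and field names below are assumptions for illustration): records arriving in different shapes from different source systems are mapped onto one canonical form before downstream systems consume them.

```python
# Sketch only: normalize records from two differently shaped source
# systems into one canonical form for downstream consumers. The source
# schemas and field names are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class CanonicalTrade:
    trade_id: str
    ticker: str
    quantity: int
    executed_at: datetime


def from_order_system(raw: dict) -> CanonicalTrade:
    # Source A uses short field names and epoch timestamps.
    return CanonicalTrade(
        trade_id=str(raw["id"]),
        ticker=raw["sym"].upper(),
        quantity=int(raw["qty"]),
        executed_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
    )


def from_legacy_backoffice(raw: dict) -> CanonicalTrade:
    # Source B uses verbose field names and ISO-8601 strings.
    return CanonicalTrade(
        trade_id=raw["TradeReference"],
        ticker=raw["Instrument"].upper(),
        quantity=int(raw["ExecutedQuantity"]),
        executed_at=datetime.fromisoformat(raw["ExecutionTime"]),
    )


# Downstream systems see one schema regardless of origin.
trades = [
    from_order_system({"id": 42, "sym": "abc", "qty": 100, "ts": 1700000000}),
    from_legacy_backoffice(
        {
            "TradeReference": "T-0042",
            "Instrument": "ABC",
            "ExecutedQuantity": 100,
            "ExecutionTime": "2023-11-14T22:13:20+00:00",
        }
    ),
]
print(trades)
```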

The industry will continue to find new data sources to exploit and add new data-intensive services and technology applications. Just look at the growth of digital assets or the rise of environmental, social and governance (ESG) investment strategies for proof of this dynamic over the last few years. As these new products and services evolve, the complexity of data is only going to increase, and the volumes will grow as firms add more required data sources. How teams surface insights from this data and how quickly they can turn these insights into client-focused activities is already an arms race within the industry. Adding AI and ML into the mix will accelerate the race further.

Firms will also continue to grow organically and inorganically. Silos are almost impossible to eradicate, so learning how to better live with them is part and parcel of a digital transformation program. Interoperability has become a keyword within the industry, due to the need to connect multiple entities, internal systems and market infrastructures. Interoperability isn’t something that firms should only demand from external providers; it is equally important within the four walls of their organizations. At the heart of interoperable operations is the simplification of the data stack, not by ripping and replacing but via augmentation.

There is little appetite for big bang projects that require multi-year implementations with promised return on investment (ROI) several years down the line. Firms want to become more agile and deliver incremental benefits to their clients sooner rather than later. Building vast new internal platforms from scratch is therefore best avoided, especially given the ongoing burden of maintaining those systems over time. Partnering with specialist vendors that can offer the right level of expertise and support is more palatable for those looking for faster deployment and incremental delivery.

Building a more resilient financial institution that can withstand the volatility of the markets, address the new product and services whims of its varied client base, increase its overall agility, and stand its ground against its competitors, new and old, requires a solid data foundation. Firms with reliable, accurate, on-demand data can adapt as the market, regulators and their clients demand.


Virginie O’Shea is CEO and Founder of Firebrand Research. As an analyst and consultant, Ms. O’Shea specializes in capital markets technology, covering asset management, international banking systems, securities services and global financial IT. This blog post was originally published on TabbFORUM.
