Three Ways to Simplify Data Architecture with a Modern Data Platform


As more optimistic economic conditions emerge after the pandemic, organizations need a single, real-time view of accurate and reliable data so they can deliver value to customers, reduce risk, and respond rapidly and effectively to new opportunities and challenges.

That’s far from easy, however, for the many organizations that have accumulated multiple, diverse technologies to manage different types of data and workloads. Often through no fault of their own, IT leaders in these organizations face unnecessary architectural complexity and latency: many moving parts, and the constant need to move data among them.

With greater agility as the overall aim, architectural simplification is a worthwhile goal: it reduces costs, increases performance, and helps improve security. It is within every organization’s reach, and here are three ways to achieve it.

The multi-model pathway

The first of these pathways to simplification is a pure multi-model database. It replaces multiple special-purpose database management systems, such as key-value, document, or graph databases, with a single representation of data on disk that is accessible as any model on demand, without data duplication.
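To make the idea concrete, here is a minimal Python sketch of that principle; all class and method names are hypothetical, not an InterSystems API. One record is stored once and projected on demand as a key-value entry, a JSON document, or a relational-style row:

```python
import json

class MultiModelStore:
    """Toy multi-model store: one copy of each record,
    projected on demand into different data models.
    (Hypothetical names, for illustration only.)"""

    def __init__(self):
        self._records = {}  # the single underlying representation
        self._columns = ("guest", "room", "nights")

    def put(self, key, guest, room, nights):
        self._records[key] = {"guest": guest, "room": room, "nights": nights}

    # Key-value view: fetch the record directly by key
    def get(self, key):
        return self._records[key]

    # Document view: the same record, serialized as JSON
    def as_document(self, key):
        return json.dumps(self._records[key])

    # Relational view: the same record, as a row tuple
    def as_row(self, key):
        rec = self._records[key]
        return tuple(rec[c] for c in self._columns)

store = MultiModelStore()
store.put("booking:42", guest="Ada", room=101, nights=3)
print(store.get("booking:42"))          # key-value access
print(store.as_document("booking:42"))  # document access
print(store.as_row("booking:42"))       # relational access
```

Because every "view" reads the same stored record, there is nothing to synchronize and nothing to duplicate, which is the essence of the multi-model argument above.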

Many areas of business can employ this multi-model approach. In the hospitality industry, for example, a single hotel booking application may rely on as many as five database management systems, a set-up likely to undermine the high availability of data. Each store has its own model for scale-out, load, disaster recovery, availability, and security, which results in data duplication and many opportunities for things to go wrong.

Applications built on such systems must undergo extensive testing, and reusing data becomes problematic. Beyond the need to learn multiple products, debugging and support eat up employee time, increasing total cost of ownership.

Compare this with a simplified hotel booking application, which can reduce development costs by a factor of three. A single multi-model system is comparatively easy to scale and provides easier access to data from a common pool. It is also simpler to refactor as requirements change, and it eliminates cross-system latency.

The translytical data platform

The second pathway to a simpler architecture is the transactional-analytic, or “translytical,” platform, which combines transactional and analytical data management capabilities in a single database engine. Whereas a transactional system is optimized for transaction processing, an analytic system is optimized for queries and analytic workloads. Putting both capabilities in one database engine delivers real-time insight and action without compromise: modern data platform technology consolidates the architecture, eliminating the latency of moving data between systems, and it does so without sacrificing performance or scalability for either workload type.
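The principle can be illustrated with a small Python sketch that uses the standard library’s SQLite engine purely as a stand-in for a translytical platform (it is not one): the same engine accepts transactional writes and immediately answers analytic queries over the live data, with no copy into a separate warehouse.

```python
import sqlite3

# One in-memory engine standing in for a translytical platform:
# the same database handles transactional writes and analytic reads,
# so no data has to be moved into a separate analytics system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")

# Transactional workload: individual trade/order writes
with conn:  # the block commits atomically
    conn.execute("INSERT INTO trades VALUES ('ACME', 100, 10.5)")
    conn.execute("INSERT INTO trades VALUES ('ACME', 50, 10.7)")
    conn.execute("INSERT INTO trades VALUES ('XYZ', 200, 3.2)")

# Analytic workload: an aggregate query over the same, live data
rows = conn.execute(
    "SELECT symbol, SUM(qty), AVG(price) FROM trades"
    " GROUP BY symbol ORDER BY symbol"
).fetchall()
for symbol, total_qty, avg_price in rows:
    print(symbol, total_qty, round(avg_price, 2))
```

The point of the sketch is architectural, not about SQLite itself: when one engine serves both workloads, the extract-and-load step between an operational store and an analytics store, and the latency it introduces, simply disappears.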

In global financial services, the deployment of a single translytical platform enables major investment institutions to process billions of trades and orders per day while simultaneously servicing thousands of concurrent requests per second from hundreds of applications across the enterprise. In one organization, trade throughput has increased by more than a factor of three, while data ingestion has increased tenfold. This new approach has cut operational costs by 75%, a remarkable figure by any standard.

In-database machine learning

The third and most recent pattern for simplification is in-database machine learning (ML). It eliminates the need for separate platforms for data management and data science, and it overcomes many of the barriers to adopting an advanced technology such as ML, including the ever-present shortage of expertise and problems with usable data.

Simplification here comes in the form of AutoML capabilities such as IntegratedML, which embed ML technology directly within the data platform. This makes developing ML models accessible to staff who understand the business problems but may not have extensive data science expertise. For an organization’s data scientists, AutoML offers another major gain: it frees more of their time for higher-value activities such as tuning and evaluating models. It is also easy to embed these machine learning models into processes and workflows to create intelligent, data-driven prescriptive processes that execute in real time on events and transactions.
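In InterSystems IRIS, IntegratedML is exposed through SQL statements such as CREATE MODEL and TRAIN MODEL. The toy Python sketch below, with hypothetical function names and two deliberately simple candidate models, illustrates only the underlying AutoML idea: automatically fit several candidates and keep the one with the lowest validation error, so the user never hand-picks an algorithm.

```python
from statistics import mean

def fit_mean(xs, ys):
    """Baseline candidate: always predict the training mean."""
    m = mean(ys)
    return lambda x: m

def fit_linear(xs, ys):
    """Candidate: ordinary least squares for y = a*x + b."""
    mx, my = mean(xs), mean(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

def auto_train(train, valid):
    """Toy AutoML loop (hypothetical; real AutoML searches far larger
    model and feature spaces): fit each candidate on the training set,
    keep the one with the lowest mean squared error on validation."""
    xs, ys = zip(*train)
    vx, vy = zip(*valid)
    best_name, best_model, best_err = None, None, float("inf")
    for name, fitter in [("baseline-mean", fit_mean), ("linear", fit_linear)]:
        model = fitter(xs, ys)
        err = mean((model(x) - y) ** 2 for x, y in zip(vx, vy))
        if err < best_err:
            best_name, best_model, best_err = name, model, err
    return best_name, best_model

train = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]
valid = [(5, 10.1), (6, 11.8)]
name, model = auto_train(train, valid)
print(name, model(7))  # the roughly-linear data selects the linear model
```

Running the selection loop inside the data platform, next to the data, is what removes the separate data science stack the paragraph above describes.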

Architectural simplification will deliver right away

Whichever pathway to simplification organizations with complex architectures pursue through advances in data management technology, they will gain significant operational and cost advantages. These include substantially reduced total cost of ownership, higher levels of performance and efficiency, along with all-important scalability and resilience. These are gains that are immediately available, irrespective of whether the deployment is on-premises, in the cloud or in a hybrid environment.

As we have seen, advances in database management have dispelled the old notion that simplifying data architecture must come at the cost of performance. The alternative, deploying multiple point solutions, generates far greater complexity and puts data at risk by keeping it constantly in motion, where it is more vulnerable. The time has come to recognize that simplification has many advantages over the best-of-breed approach; the benefits are clear enough to speak for themselves.

To hear more from Jeff on the pros and cons and the trends to help you understand what is possible with multi-workload and multi-model databases, listen to his full recording from the IASA conference.
