
In 2021, demand for third-party data in financial institutions is growing, fuelled by the wide availability of innovative technology that can analyse many different data types and by the huge processing power and storage available via cloud service providers. Thousands of data providers offer countless third-party datasets that investment banks, asset and wealth managers and hedge funds can use to enhance their financial analysis and their firms’ performance. Yet financial institutions face considerable complexity in navigating the expanding data universe, identifying the data they need, evaluating its quality and handling the legal intricacies of purchasing it. Firms also face the technical complexities of ingesting, transforming and normalising diverse datasets from a growing number of data suppliers, delivered via multiple channels, formats and protocols. That bottlenecks occur is unsurprising, and workloads can become unmanageable. However, the rise of a new type of third-party data delivery model – the Data-as-a-Service Utility – can dramatically lessen the burden and allow financial firms to focus on the market activities that differentiate them from their competitors and generate revenue.

By: GreySpark’s Charles Mo, Head of Operational Risk and Controls, and Rachel Lindstrom, Senior Manager

As the vast universe of new data expands in both breadth and depth, the delivery of specific data to the point of need within financial firms can be costly and time consuming. While the insights that firms may draw from the data can create a competitive advantage, the workflow to obtain the data does not. As financial firms look to reduce costs, streamline operations and focus on high-value core competencies, many view third-party data management from discovery to ingestion as a non-differentiating process that can be outsourced. This article explores the Data-as-a-Service (DaaS) Utility model, an approach to third-party data management that can significantly reduce the complexity and cost of provisioning new datasets in financial firms.

The Rising Cost of Third-party Data in the Financial Sector

Over the last decade, waves of analysts with data science training and a rapacious appetite for data have found a place in organisations across the financial sector. The process of supplying their demands for increasingly diverse data from an expanding number of data suppliers is becoming ever more complex and costly for financial firms. Many of the cost-creating complexities can be found in the data engineering and operations teams.

Typically, a request for data in a financial firm comes from an analyst, who sends it to a data sourcing team. Although these teams are adept at navigating data sources, the variety and volume of available data are far higher than they were even five years ago, magnifying the effort required to navigate them. Where many sources of the same dataset exist, there may be differences in the format of the data, its quality and, indeed, its price, and the most appropriate source is identified with varying degrees of success. In many large organisations, each data request is dealt with in isolation, and multiple requests for the same data from analysts in different teams are common. Where there is no centralised data inventory system, the firm will inevitably pay for duplicates of data. Each data contract proffered by a third-party data provider can differ markedly in form from those of other providers, as there is no mandated standard. This lack of uniformity means that procurement teams expend significant effort reviewing contracts. Especially burdensome for larger firms is the technology and information security due diligence that must be undertaken for each new provider, so the effort to onboard new providers increases incrementally with each addition, as illustrated in Figure 1.
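To make the duplicate-purchase point concrete, the sketch below shows the essence of a centralised data inventory check – a lookup performed before any new sourcing effort begins. It is a minimal illustration only; the dataset keys and fields are hypothetical, not any particular firm’s tooling.

```python
# A minimal sketch of a centralised data inventory check. Before a new
# purchase, the sourcing team looks up whether the firm already
# licenses the dataset. All dataset keys and fields are hypothetical.
inventory = {
    "eod_equity_prices/emea": {"provider": "Provider A", "licence_expiry": "2022-06-30"},
    "fx_spot_ticks/g10": {"provider": "Provider B", "licence_expiry": "2021-12-31"},
}

def already_licensed(dataset_key: str) -> bool:
    """Return True if the firm already pays for this dataset."""
    return dataset_key in inventory

if already_licensed("eod_equity_prices/emea"):
    print("Already licensed - route the analyst to the existing feed.")
else:
    print("No existing licence - begin sourcing and due diligence.")
```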

After providers have been onboarded, the ongoing workload to manage them and their datasets is far from insignificant: it includes contract management considerations for each, and requires vigilance by procurement teams to ensure the firm avoids paying for data that no longer provides meaningful business value. This ongoing effort may prompt procurement teams to push back on introducing new vendors and to acquire the data through existing relationships instead, which may not be the best source in terms of quality or price.

Figure 1: Effort Expended by a Single Financial Firm to Procure and Deliver Volumes of Data 


Source: GreySpark analysis

It falls to the IT teams to facilitate the delivery of data to the point of need within the financial firm, and this is by no means straightforward. The procured data can arrive from the data provider in any one of many formats, partly as a consequence of the lack of data standards or common protocols and partly due to the sheer diversity of the data universe; this has implications for its ingestion and presents IT with the first of several challenges. The data must be normalised and transformed so that it can be used and stored, and IT teams must develop the capability to continuously monitor the data for changes that the provider may have made to the schema, as a schema change that goes undetected creates issues downstream. While larger firms with more elastic budgets may manage these resource-hungry challenges adequately, for all firms, large and small, third-party data delivery management from sourcing to ingestion creates a significant burden which, if poorly managed, undermines the timely and efficient delivery of data to the point of need.
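As an illustration of the schema-monitoring burden, below is a minimal sketch in Python of the kind of drift check an IT team might run against each incoming file before it flows downstream. The expected column list and file name are hypothetical.

```python
import csv

def read_header(path):
    """Return the column names from the first row of a delimited file."""
    with open(path, newline="") as f:
        return next(csv.reader(f))

def detect_schema_drift(path, expected_columns):
    """Compare an incoming file's header with the schema agreed with
    the data provider and flag any added or removed columns."""
    actual = read_header(path)
    added = [c for c in actual if c not in expected_columns]
    removed = [c for c in expected_columns if c not in actual]
    if added or removed:
        # In production this would alert the team and quarantine the
        # file before it reaches downstream consumers.
        raise ValueError(f"Schema drift: added={added}, removed={removed}")

# Hypothetical schema agreed with the provider at onboarding:
# detect_schema_drift("daily_prices.csv", ["trade_date", "isin", "price", "volume"])
```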

In 2021, there is a very real struggle in firms to keep up with evolving data trends. The outsourcing of third-party data delivery – a process that delivers no differentiation and presents many challenges – is an increasingly popular approach. A DaaS provider can significantly reduce the effort that a financial firm expends on third-party data delivery, and these benefits are multiplied when the DaaS becomes a utility.

The Data-as-a-Service Utility Model

Cloud technology is widely utilised across the financial services industry, but primarily for specific use cases: running software, hosting a particular platform or simply storing data. Unless the financial firm has mandated the use of a single cloud provider, different teams can use different cloud providers, each having built its own integration to its provider of choice. Beyond the generic advantages that all cloud services offer, the DaaS Utility model affords its users two key additional benefits.

Due to the speed and simplicity of the process, the DaaS model – the outsourcing of third-party data delivery – is particularly suited to supporting the ad hoc analysis that is part and parcel of agile trading strategy scenario simulations, now a staple for financial services firms rather than a nice-to-have. In the DaaS model, data from multiple third-party data providers is ingested by the DaaS, whatever the format or protocol, and delivered via the cloud to its financial firm client (see Figure 2). The ingestion challenges presented by the diversity of delivery mechanisms, file formats and data schemas are absorbed by the DaaS, which ensures that the financial firm receives data in the format it wants. These benefits are multiplied when the DaaS ingests third-party data from multiple providers and delivers it via one mechanism to multiple clients – at which point the DaaS model becomes a DaaS Utility model.
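As a rough sketch of the normalisation work a DaaS absorbs, the Python below reads provider files in either CSV or JSON and maps provider-specific field names onto one common schema. The providers, field names and mappings are invented for illustration.

```python
import csv
import json
from pathlib import Path

# Hypothetical per-provider mappings from source field names to the
# common schema the client has asked for.
FIELD_MAPS = {
    "provider_a": {"Date": "trade_date", "Ticker": "symbol", "Close": "price"},
    "provider_b": {"dt": "trade_date", "sym": "symbol", "px": "price"},
}

def load_records(path):
    """Read a provider file as a list of dicts, whatever its format."""
    p = Path(path)
    if p.suffix == ".csv":
        with p.open(newline="") as f:
            return list(csv.DictReader(f))
    if p.suffix == ".json":
        return json.loads(p.read_text())
    raise ValueError(f"Unsupported format: {p.suffix}")

def normalise(records, provider):
    """Rename provider-specific fields to the common schema."""
    mapping = FIELD_MAPS[provider]
    return [{mapping[k]: v for k, v in rec.items() if k in mapping}
            for rec in records]

# e.g. normalise(load_records("provider_a_prices.csv"), "provider_a")
```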

Figure 2: Taking the DaaS approach, financial institutions receive data via the delivery method, and in the format and protocol they need


Source: GreySpark analysis

The hub-and-spoke nature of the DaaS Utility, shown in Figure 3, not only provides a layer of abstraction between financial firms and third-party data providers, but also creates an industry catalogue for data, giving procurement teams a go-to source. The network effect means that the technical ‘heavy lifting’ done by the DaaS Utility for one client benefits all its clients. It also collapses the connectivity problem: for example, 50 data suppliers serving 20 consuming firms point-to-point implies up to 50 × 20 = 1,000 integrations, whereas a hub requires only 50 + 20 = 70. The DaaS Utility acts as a simplifying conduit for data, whatever the original source.

While the role of the DaaS Utility is primarily to provide ‘the pipes’ that move data between these parties efficiently, it can be much more. It can assist data consumers with finding data and managing data suppliers, and while the data contract remains between supplier and consumer, the DaaS Utility is well positioned to offer both parties the use of a standardised contract. This introduces efficiencies into the transactional process for every client – be they supplier or consumer – as both will have multiple counterparties that also utilise the DaaS Utility, and the workload of their procurement and legal teams is thus much reduced. Data suppliers are keen to take advantage of what is, in effect, a new channel to their target clients – financial firms – and so are more willing to use a standardised agreement, which speeds up the due diligence and onboarding of the data supplier by financial institutions.

Figure 3: From DaaS to a DaaS Utility via the Network Effect


Source: GreySpark analysis

In summary, a financial firm engaging a DaaS Utility can overcome persistent challenges with which firms have long struggled, as well as gain the benefits summarised in Figure 4. Firms will see a significant reduction in the effort they expend on API management, third-party data inventory management and cloud integration. Requested data can be made available far more quickly, and savings can be made on the maintenance of both data schemas and data ingestion solutions. The DaaS Utility model is also well placed to deliver data engineering services to financial services firms: datasets can be transformed to meet the custom requirements of the end user, through, for instance, aggregation, normalisation and enrichment.
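To picture those transformations, here is a brief pandas sketch covering aggregation (a volume-weighted average price) and enrichment (joining in reference data); the datasets and column names are entirely hypothetical.

```python
import pandas as pd

# Hypothetical raw third-party dataset: one row per trade.
trades = pd.DataFrame({
    "symbol": ["ABC", "ABC", "XYZ"],
    "price": [101.0, 103.0, 55.0],
    "volume": [200, 100, 500],
})

# Hypothetical reference data used for enrichment.
reference = pd.DataFrame({
    "symbol": ["ABC", "XYZ"],
    "sector": ["Financials", "Energy"],
})

# Aggregation: volume-weighted average price per symbol.
agg = (trades.assign(notional=trades["price"] * trades["volume"])
             .groupby("symbol", as_index=False)
             .agg(volume=("volume", "sum"), notional=("notional", "sum")))
agg["vwap"] = agg["notional"] / agg["volume"]

# Enrichment: join sector classifications onto the aggregated data.
enriched = agg.merge(reference, on="symbol", how="left")
print(enriched[["symbol", "vwap", "sector"]])
```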

Figure 4: The Challenges of Third-party Data Management and the Advantages of Using a DaaS Utility


Source: GreySpark analysis

Relieving the Burden of Point-to-point Connectivity in a Multi-cloud Environment

As cloud technology emerged as a bona fide financial services technology over the last decade, firms were left with pockets of tactically deployed cloud services across their enterprises. A handful of tier-1 public cloud providers deployed a range of cloud services providing compute power and storage, and managed service providers rushed to put their software and platforms into the cloud. As a result, a disconnected multi-cloud environment evolved in many firms, necessitating the creation of point-to-point links between each cloud and each team within the financial institution to enable the sharing of data between applications hosted in different clouds.

Figure 5: Leveraging the Multi-cloud Connectivity Provided by a DaaS Utility


Source: GreySpark analysis

By leveraging the data-driven connectivity of the DaaS Utility model, shown in Figure 5, IT teams no longer need to create workflow-driven point-to-point integrations from each team to each cloud; instead, they can use the DaaS Utility’s ready-made integrations to access data in their desired destination. In effect, tactical investments in cloud technology are transformed into a strategic advantage: an integrated application landscape across multiple clouds that is less exposed to concentration risk.
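One way to picture the difference: instead of each team coding against each cloud’s SDK, consumers code against a single delivery interface, and the cloud-specific integrations live behind it. The sketch below is illustrative only; the class and method names are invented rather than an actual DaaS API, and real integrations would call the respective cloud SDKs.

```python
from abc import ABC, abstractmethod

class CloudDestination(ABC):
    """A destination a dataset can be delivered to, whichever cloud hosts it."""

    @abstractmethod
    def deliver(self, dataset_name: str, payload: bytes) -> None:
        ...

class S3Destination(CloudDestination):
    def __init__(self, bucket: str):
        self.bucket = bucket

    def deliver(self, dataset_name: str, payload: bytes) -> None:
        # A real integration would upload via the AWS SDK (boto3).
        print(f"Delivered {dataset_name} ({len(payload)} bytes) to s3://{self.bucket}")

class GCSDestination(CloudDestination):
    def __init__(self, bucket: str):
        self.bucket = bucket

    def deliver(self, dataset_name: str, payload: bytes) -> None:
        # A real integration would upload via the Google Cloud Storage client.
        print(f"Delivered {dataset_name} ({len(payload)} bytes) to gs://{self.bucket}")

# A consuming team picks its destination once; the delivery code path is
# identical whichever cloud the team happens to use.
destination: CloudDestination = S3Destination("research-team-bucket")
destination.deliver("daily_prices", b"...payload...")
```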

The Changing Data Landscape

That financial institutions should be ‘data-driven’ is a simple idea that is complex to realise. It is well understood that many firms struggle with joined-up thinking when it comes to third-party data. The laudable aim of ensuring a clean and efficient enterprise-wide data architecture is often disrupted by tactical implementations at the division or team level. The introduction of emerging technologies, such as AI, Big Data and multi-cloud platforms, is increasing both the demand for third-party data and the burden on data sourcing, procurement and IT teams. While data management cannot, and should not, be outsourced in its entirety, firms that abstract themselves from the third-party data ‘plumbing’ will release resources that can instead concentrate on providing real competitive differentiation.

Crux Informatics, a DaaS Utility, addresses many of the challenges that financial firms face, from data discovery to the ingestion of third-party data, and the firm’s partnerships with all major cloud providers remove the need for multi-cloud point-to-point integration. The heavy lifting that the firm undertakes for one client benefits all, reducing the burden of non-differentiating IT effort in a smart and innovative way and leaving financial firms with the bandwidth to focus on innovation in today’s ever-changing technology landscape. Crux offers a managed data engineering service that helps companies scale their most critical data delivery, operations and transformation needs. Its cloud-based technology stack accelerates the flow of data between data consumers and suppliers. The firm securely delivers over 14,000 datasets from hundreds of sources, with custom validations and transformations. Crux was founded in 2017 by financial technology veterans and is backed by firms that include Citi, Goldman Sachs, Morgan Stanley and Two Sigma.

Charles is Head of GreySpark’s Electronic Trading Risk Management (ETRM) practice, with over 20 years of experience in the capital markets industry, providing a wide range of services to global investment banks across multiple functions (Sales & Trading, IT, Risk, Legal & Compliance and Operations) and asset classes (FX, Equities, FICC).