In 2021, demand for third-party data in financial institutions is growing, fuelled by the wide availability of innovative technology that can analyse many different data types and by the huge processing power and storage available via cloud service providers. Thousands of data providers offer countless third-party datasets that investment banks, asset and wealth managers and hedge funds can use to enhance their financial analysis and their firms' performance. Yet financial institutions face considerable complexity in navigating the expanding data universe: identifying the data they need, evaluating its quality and handling the legal intricacies of purchasing it. Firms also face the technical complexities of ingesting, transforming and normalising diverse datasets from a growing number of data suppliers, delivered via multiple channels, formats and protocols. It is unsurprising that bottlenecks occur and that workloads can become unmanageable. However, the rise of a new type of third-party data delivery model – the Data-as-a-Service Utility – can dramatically lessen this burden and allow financial firms to focus on the market activities that differentiate them from their competitors and generate revenue.
As the vast universe of new data expands in both breadth and depth, delivering specific data to the point of need within financial firms can be costly and time consuming. While the insights that firms draw from the data can create a competitive advantage, the workflow required to obtain the data does not. As financial firms look to reduce costs, streamline operations and focus on high-value core competencies, many view third-party data management – from discovery to ingestion – as a non-differentiating process that can be outsourced. This article explores the Data-as-a-Service (DaaS) Utility model, an approach to third-party data management that can significantly reduce the complexity and cost of provisioning new datasets in financial firms.