16 August 2021

Solving the Mystery of Data Aggregation

Data aggregation has never been more important than it is today for buy-side firms and technology service providers. Whether you are reconciling back to your counterparties, preparing performance reports for your clients, or considering how to meet existing and approaching industry regulations, post-trade data is at the heart of meeting these goals. Flexibility with delivery and support structures, along with breadth and reliability of data source coverage, has become increasingly vital to enable firms to best support their unique client needs.

These evolving needs are pushing firms to rely on technology vendors to aggregate data for them. In addition to offering scalable processes that can collect data from dozens or even hundreds of sources, vendors can supply the key ingredient that so often eludes financial services firms – data normalization. Firms that use their solution providers wisely can also take advantage of a vendor’s ability to connect to a wide range of data sources while keeping the data consistent in how it is delivered to consumers.

Any high-quality vendor of interfaces insists on having these fundamentals in place:

  1. A Direct Relationship with the custodian or broker to access data in its native format, directly from the source. Initial communication with the source is equally important; if the data will be used for reconciliation, the data source must be able to provide, at a minimum, daily positions, transactions, and cash balances on a T+1 basis.

  2. Accurate Data Mapping - part art, part science. The goal is to transform the full range of files from any data source’s native format into files that are compatible with your accounting system. An important step is translating the data source’s transaction codes into the standard codes your solution expects. Many large brokers and custodians have hundreds of native transaction codes that must be mapped reliably to a relatively small set of specific transaction codes; the same is true of security types. If this mapping isn’t done correctly, it can cause serious problems when reconciling your accounts (a simplified sketch of this mapping follows the list below).

  3. Testing, Automation, and Monitoring - Later phases of interface development should always involve rigorous testing, including simulated runs against the destination systems and trials with beta testers, especially for interfaces built for scalability and broad use.

  4. A robust data security infrastructure - Investment managers should have the highest level of confidence in the security and privacy of their confidential data. Any data collection solution in the financial industry should check all the right boxes when it comes to industry-leading safeguards.
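
To make the mapping step above concrete, here is a minimal sketch in Python. The native codes, standard codes, and field names are hypothetical and invented purely for illustration; a production mapping layer covers hundreds of codes per source and is maintained continuously. The principle, however, is the same: translate many native codes into a small standard set, and flag anything unmapped for review rather than guessing.

```python
# Hypothetical map from a custodian's native transaction codes to the
# small set of standard codes a destination accounting system understands.
# All codes here are invented for illustration only.
NATIVE_TO_STANDARD = {
    "BUYL": "BUY",       # buy long
    "SELLL": "SELL",     # sell long
    "DIVR": "DIVIDEND",  # dividend received
    "INTR": "INTEREST",  # interest received
    "FEEM": "FEE",       # management fee
}

def map_transaction_code(native_code: str) -> str:
    """Translate a source's native code into the standard code.

    Unknown codes are flagged rather than guessed, so they can be reviewed
    and added to the map before they cause reconciliation breaks.
    """
    try:
        return NATIVE_TO_STANDARD[native_code]
    except KeyError:
        raise ValueError(f"Unmapped transaction code: {native_code!r}")

# Example: a raw custodian record becomes an accounting-system record.
raw = {"account": "ABC123", "code": "DIVR", "amount": 152.40}
normalized = {**raw, "code": map_transaction_code(raw["code"])}
print(normalized)  # {'account': 'ABC123', 'code': 'DIVIDEND', 'amount': 152.4}
```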


A long-standing phrase in the data world is “garbage in, garbage out,” and it is no less true when mapping account-level information from financial institutions. That is not to say the data coming from financial institutions is low quality; rather, the saying is a reminder to ensure the output you receive matches the requirements of your destination system and your business goals.

The quality and consistency of data delivered to buy-side firms through broker and custodian interfaces depend heavily on the quality of the source data and the vendor’s data mapping expertise. It is not unusual for counterparties to process certain information differently, to lack detail on certain activities, or to change their mapping, requiring a vendor to act quickly.

Choosing the right data aggregation vendor is critical to ensuring success with your data collection goals. For more information, visit SS&C Advent Data Solutions, read our product brief, or request a demo.