14 April 2020

Data Aggregation: Part Art, Part Science

Data aggregation has never been more important for buy-side firms than it is today. Whether you are reconciling back to your counterparties, preparing performance reports for your clients, or considering how to meet existing and upcoming industry regulations, post-trade data is at the heart of meeting these goals.

Firms are relying heavily on technology vendors to aggregate data for them. In addition to having scalable processes that can collect data from dozens or even hundreds of sources, vendors can also provide the key ingredient whose absence often plagues financial services firms: data normalization. Firms that choose solution providers wisely can take advantage of a vendor's ability to connect to a wide range of data sources while keeping the data consistent for the consumers who receive it.

A long-standing phrase in the data world is "garbage in, garbage out," and it is no less true when mapping account-level information from financial institutions. That is not to say that data from financial institutions is garbage; rather, it underscores how important it is to consume data that matches the requirements of your destination system and business goals, in order to ensure the best possible output.

Any high-quality provider of custodian connectivity insists on the following:

  • Direct relationship: A direct relationship secures access to data in its native format straight from the source – either the custodian or the broker. Initial communication with the source is equally important; if the data will be used for reconciliation, the source should certify that it can provide, at a minimum, daily positions, transactions, and cash balances on a T+1 basis.
  • Accurate data mapping: This is part art and part science. The goal is to transform the full range of files from any data source’s native format into files compatible with the accounting system. A key step is translating the source’s native transaction codes into standard codes: many large brokers and custodians use hundreds of native transaction codes that must be mapped reliably to a relatively small set of standard codes, and the same is true of security types (see the sketch after this list). Done incorrectly, this causes serious problems when reconciling your accounts.
  • Testing, Automation, and Monitoring: Later phases of interface development should always involve rigorous testing, including simulated internal testing against destination systems and client beta testing programs. Testing is especially important when building scalable interfaces intended for broad use.
  • Robust data security infrastructure: Investment managers should have the highest level of confidence in the security and privacy of their confidential data. Any data collection solution in the financial industry should check all the right boxes when it comes to industry-leading safeguards.
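
To make the mapping step concrete, here is a minimal sketch in Python of how native transaction codes might be normalized to a small standard set. The native codes, the mapping table, and the unknown-code handling are illustrative assumptions, not any vendor's actual tables or logic.

```python
# A minimal, hypothetical sketch of native-to-standard transaction code
# mapping. The codes below are invented for illustration.

# Mapping from a custodian's native transaction codes to a small set of
# standard codes understood by the destination accounting system.
NATIVE_TO_STANDARD = {
    "BUYL": "BUY",        # buy, long
    "BUYC": "BUY",        # buy to cover
    "SLLL": "SELL",       # sell, long
    "DIVC": "DIVIDEND",   # cash dividend
    "DIVR": "DIVIDEND",   # dividend reinvestment
    "INTP": "INTEREST",   # interest payment
}

def normalize_code(native_code: str) -> str:
    """Map a native transaction code to a standard code.

    Unknown codes are flagged rather than guessed, so a new or changed
    code at the source triggers review instead of silently corrupting
    downstream reconciliation.
    """
    try:
        return NATIVE_TO_STANDARD[native_code]
    except KeyError:
        raise ValueError(f"Unmapped native transaction code: {native_code!r}")

if __name__ == "__main__":
    for code in ("DIVR", "BUYL"):
        print(code, "->", normalize_code(code))
```

Failing loudly on an unknown code, rather than guessing, is the design choice that lets a vendor spot a new or changed code at the source before it reaches the accounting system.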

The quality and consistency of data delivered to buy-side firms through broker and custodian interfaces depends heavily on the quality of the source data and on the vendor’s data mapping expertise. It is not unusual for counterparties to process certain information differently, to lack detail on specific activities, or to change their mappings, requiring a vendor to react quickly. Choosing the right data aggregation vendor is critical to meeting your data collection goals.
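
Reacting quickly starts with noticing that something is wrong. Here is an equally hedged sketch of a daily delivery check against the T+1 requirement from the checklist above, run before reconciliation begins; the directory, file-naming scheme, and weekend-only business-day calendar are assumptions made for illustration.

```python
# A minimal, hypothetical completeness check: confirm the source delivered
# positions, transactions, and cash balances for the prior business day.
from datetime import date, timedelta
from pathlib import Path

REQUIRED_FILES = ("positions", "transactions", "cash_balances")

def previous_business_day(today: date) -> date:
    """Step back to the most recent weekday (holidays ignored for brevity)."""
    day = today - timedelta(days=1)
    while day.weekday() >= 5:  # Saturday=5, Sunday=6
        day -= timedelta(days=1)
    return day

def check_delivery(inbox: Path, today: date) -> list[str]:
    """Return the names of required T+1 files that have not arrived."""
    as_of = previous_business_day(today).isoformat()
    return [
        f"{name}_{as_of}.csv"                      # hypothetical naming scheme
        for name in REQUIRED_FILES
        if not (inbox / f"{name}_{as_of}.csv").exists()
    ]

if __name__ == "__main__":
    missing = check_delivery(Path("/data/custodian_inbox"), date.today())
    if missing:
        print("Delivery incomplete; escalate to the data source:", missing)
    else:
        print("All required T+1 files received.")
```

A check like this turns "act quickly" into something operational: a missing or late file is caught the same morning, before it surfaces as a reconciliation break.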

For more information, visit SS&C Advent Data Solutions or contact us.