For too long, asset managers have been focusing on their data warehouse initiatives, often spending many years building them out, by which time the company in question has moved on and acquired other businesses with new data silos, warehouses, databases and data marts (you get the message): just as one data warehouse project finishes, a new ‘enterprise data warehouse’ project is needed to bring the latest silos under the same umbrella. So the “information stores” which a business uses to deliver data to the external world are constantly changing, which leads to a lot of client- and vendor-related unhappiness.
At the same time, the client reporting teams have been working away to build a client reporting infrastructure that delivers bespoke glossy reports, often without taking into account that a client report generated with poor quality data is still a poor report, even if it is completely tailored to the client’s needs and flashy and glossy to the eye of the (be)holder. Client reporting vendors are left to carry the can to a large extent: the asset manager expects the client reporting vendor to clean the data up, while the client reporting vendor expects the asset manager to deliver clean data!
The marketing teams are busy trying to automate the production of the product fact sheets. Again, the fact sheet automation vendor community places a strong reliance on good quality data being submitted to the process, while asset managers largely assume the data is of good quality; why else would they allow it to enter an automation process?
Micro-site content publication suffers from the same ills as the fact sheet automation process, except the problem is even more acute, as the data is expected to be updated daily rather than monthly or quarterly.
The legal and compliance teams are meanwhile busy working with their financial printers to automate the simplified (or summary) prospectus, the KID (if you’re in UCITS IV prep mode), annual/semi-annual reports and any other regulatory product documentation. Again, though, the financial printers expect to receive clean and timely data.
So what we have is an industry that unwittingly expects its vendors to work with poor quality data to generate high quality output; i.e. it is set up to fail miserably when it comes to getting good quality (= timely, consistent and accurate) data to the end investor, be that via a custom client report, a product fact sheet update, a web page view or a regulatory document.
If asset managers applied the same principles that network engineers use to protect the integrity of their internal networks, deploying an “information firewall” that presents all client facing data to each of the vendors that require access to it, they would have (a) a much happier vendor community and (b) superior content being presented to their end clients.
So an information firewall for investment product data should do some or all of the following;
- It should prevent inconsistent or inaccurate data from reaching the external world by applying business rules to the content before it is released (see the validation sketch after this list).
- It should aggregate data from the myriad of silos within an organization and present a common ‘client facing data model’, so that the flux in the background is hidden from downstream consumers of the aggregated content view (the source adapter sketch below illustrates one way to do this).
- It should be flexible with respect to how and where data is sourced: sources need to be ‘plug and play’, so that rapid changes can be made to the back end without impacting the external view of the content, allowing a company to evolve scalably.
- It should provide a mechanism to assign ownership of, and accountability for, specific data points/domains to named owners, so that a strong data governance model can be applied.
- It should allow centralized oversight of the data quality management process while supporting distributed ownership of data domains.
- It should allow external views to be easily customized for each consumer of the content being delivered to the external world.
- It should provide multi-tiered MIS to ensure the system and the processes around it are transparent.
- It should enable workflows to be built around exception management, ensuring that data quality issues are fixed in a timely manner and that downstream consumers are not delayed in generating time-critical output (see the exception routing sketch below).
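To make the business rules idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical and for illustration only: the FundRecord fields and the three rules are invented, and a real deployment would source its rule set from the data governance process described above.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical client-facing record for a single fund; a real model
# would carry far more fields (performance, holdings, narratives, ...).
@dataclass
class FundRecord:
    fund_id: str
    nav: float
    currency: str
    top_ten_weight: float   # sum of top-10 holding weights, in percent
    as_of_date: str         # ISO date the data was valued

# A business rule takes a record and returns an error message, or None if it passes.
Rule = Callable[[FundRecord], Optional[str]]

RULES: list[Rule] = [
    lambda r: "NAV must be positive" if r.nav <= 0 else None,
    lambda r: "Unknown currency" if r.currency not in {"USD", "EUR", "GBP", "CHF"} else None,
    lambda r: "Top-10 weights cannot exceed 100%" if r.top_ten_weight > 100.0 else None,
]

def firewall_check(record: FundRecord) -> list[str]:
    """Apply every rule; a record passes the firewall only if no rule fires."""
    return [msg for rule in RULES if (msg := rule(record)) is not None]

record = FundRecord("FUND-001", nav=-12.5, currency="USD",
                    top_ten_weight=43.2, as_of_date="2011-03-31")
errors = firewall_check(record)
if errors:
    print("Blocked at the firewall:", errors)   # erroneous data never reaches vendors
else:
    print("Released to downstream consumers")
```

The point of keeping each rule as a small, independent function is that the rule set can grow with the business without any change to the firewall machinery itself.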
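The ‘plug and play’ sourcing point can be sketched in the same spirit. The adapter pattern below maps each back-end silo onto one hypothetical client facing model, so silos can be swapped without the downstream view changing; the adapter classes, source names and field mappings are all invented for illustration.

```python
from abc import ABC, abstractmethod

# One hypothetical client-facing shape, regardless of which silo supplied the data.
CLIENT_FACING_FIELDS = ("fund_id", "nav", "currency")

class SourceAdapter(ABC):
    """Each back-end silo gets an adapter that emits the common model."""
    @abstractmethod
    def fetch(self, fund_id: str) -> dict: ...

class WarehouseAdapter(SourceAdapter):
    # Imagine this wrapping a SQL query against the enterprise data warehouse.
    def fetch(self, fund_id: str) -> dict:
        row = {"FUND_CODE": fund_id, "NAV_AMT": 101.37, "CCY": "EUR"}  # stubbed row
        return {"fund_id": row["FUND_CODE"], "nav": row["NAV_AMT"], "currency": row["CCY"]}

class AcquiredSystemAdapter(SourceAdapter):
    # A silo inherited through an acquisition, with its own naming conventions.
    def fetch(self, fund_id: str) -> dict:
        rec = {"id": fund_id, "price": 99.12, "iso_ccy": "GBP"}  # stubbed record
        return {"fund_id": rec["id"], "nav": rec["price"], "currency": rec["iso_ccy"]}

# Swapping the back end is a one-line registry change; consumers never notice.
SOURCES: dict[str, SourceAdapter] = {
    "FUND-001": WarehouseAdapter(),
    "FUND-002": AcquiredSystemAdapter(),
}

def client_facing_view(fund_id: str) -> dict:
    data = SOURCES[fund_id].fetch(fund_id)
    assert set(data) == set(CLIENT_FACING_FIELDS)  # every adapter honours the model
    return data

print(client_facing_view("FUND-001"))
print(client_facing_view("FUND-002"))
```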
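Finally, a sketch of the exception routing idea: records blocked at the firewall are queued against accountable owners so that issues get fixed in time. The domain-to-owner mapping is hypothetical; in practice it would come from the governance model above, and the output would feed a real workflow tool rather than printing to the console.

```python
from collections import defaultdict

# Hypothetical governance registry: each data domain has a named, accountable owner.
DOMAIN_OWNERS = {
    "pricing":     "pricing-team@example.com",
    "performance": "performance-team@example.com",
    "static":      "product-data-team@example.com",
}

# Exceptions raised at the firewall, tagged with the domain they belong to.
exceptions = [
    {"fund_id": "FUND-001", "domain": "pricing",     "issue": "NAV must be positive"},
    {"fund_id": "FUND-007", "domain": "static",      "issue": "Unknown currency"},
    {"fund_id": "FUND-001", "domain": "performance", "issue": "1Y return missing"},
]

def route_exceptions(items: list[dict]) -> dict[str, list[dict]]:
    """Group exceptions by accountable owner so each queue can be worked in parallel."""
    queues: dict[str, list[dict]] = defaultdict(list)
    for item in items:
        queues[DOMAIN_OWNERS[item["domain"]]].append(item)
    return queues

for owner, queue in route_exceptions(exceptions).items():
    # Centralized oversight: one place sees every open issue and who owns it.
    print(f"{owner}: {len(queue)} open issue(s) -> {[q['issue'] for q in queue]}")
```

Note how this combines centralized oversight (one view of every open issue) with distributed ownership (each domain team works only its own queue).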
While the deployment of an “information firewall” alone should not in itself be seen as the de facto solution to client facing data quality problems, it is a key element of what is needed to set up a successful investment product data quality management process. Remember: technology is not a panacea for all data quality ills!
