Cleaning reference data to make it usable (again)?

Clean, high-quality data is a major issue for banks around the world. Data now feeds many different processes across a bank, so making it reliable has become of paramount importance.

The implementation of successive regulations is forcing banks to scrutinise their reporting more closely than ever, and transaction reporting now demands ever greater levels of detail. If the data a bank uses is poor, this will not only affect the quality of its reporting but will also harm the bank’s integrity.

The Markets in Financial Instruments Directive II (MiFID II) is the latest piece of legislation with which banks are grappling. Its implementation has been postponed to give the European banking industry additional time to improve its data quality, and meeting the new standard requires a tectonic shift in how data is managed.

Data quality is not solely a European concern either: the implementation of the US Dodd-Frank Act has global ramifications as well. All of this amounts to constant review and re-review of data quality and standards across the world. Three years after Dodd-Frank’s implementation, a consultation is under way to review data quality, report formatting and clarity in reporting.

What’s obvious is that each new regulation brings a requirement to improve data quality. If a firm’s data falls short of what a regulation demands, the cost of remediation and the reputational damage can be huge.

As part of the MiFID II implementation process, the European Securities and Markets Authority (ESMA) must collect data on around 15 million financial instruments from some 300 trading venues. The delay in applying MiFID II stems from concerns about the technical implementation challenges facing financial market participants, national regulators and the European authorities. When the deadline arrives, the regulator will expect data quality to be extremely high.

The delay has given institutions the opportunity to revisit their strategies, review their options and make sure they are doing the right thing. Firms are rolling out software to test their reporting output and building robust test environments for continuous improvement.
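
In practice, testing reporting output often starts with automated rule checks run against each record before submission. The sketch below is illustrative only: the field names (isin, trade_timestamp, notional, venue) and the rules are assumptions for demonstration, not any regulator’s actual schema.

```python
# Illustrative data-quality check on a reported trade record.
# Field names and rules are assumptions for demonstration, not a MiFID II schema.
from dataclasses import dataclass


@dataclass
class TradeReport:
    isin: str
    trade_timestamp: str  # expected ISO 8601 in UTC, e.g. "2024-01-15T09:30:00Z"
    notional: float
    venue: str            # market identifier code, e.g. "XLON"


def validate(report: TradeReport) -> list[str]:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    # ISINs are 12 characters: a 2-letter country code, 9 alphanumerics, a check digit.
    if len(report.isin) != 12 or not report.isin[:2].isalpha():
        issues.append(f"malformed ISIN: {report.isin!r}")
    if report.notional <= 0:
        issues.append(f"non-positive notional: {report.notional}")
    if not report.venue:
        issues.append("missing venue identifier")
    if not report.trade_timestamp.endswith("Z"):
        issues.append("timestamp not reported in UTC")
    return issues


if __name__ == "__main__":
    record = TradeReport(isin="GB00B03MLX29",
                         trade_timestamp="2024-01-15T09:30:00Z",
                         notional=1_000_000.0,
                         venue="XLON")
    print(validate(record) or "no issues found")
```

Checks of this kind typically run continuously in a test environment, with failures fed back to the source systems rather than patched downstream.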

The new rules will herald a change in the way firms handle and manage data. Banks will have to store far greater volumes of data, since no one yet knows how granular the final reporting requirements will be.

Historically, client account, trade capture and market data were stored in separate databases. The new reporting requirements will require all of this data to be merged and reported together, forcing a rethink of the infrastructure that holds it. What’s more, pulling all of this data together presents a risk in itself: feeds from different systems and locations increase the risk of outages, so firms will have to ensure their systems are robust enough to avoid any seismic failures.
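
To make the merging point concrete, the sketch below joins three hypothetical extracts (client accounts, trade capture, market data) and flags any trade that cannot be fully enriched rather than dropping it silently. The identifiers and field names are assumptions for illustration, not any firm’s actual data model.

```python
# Illustrative join of three hypothetical source extracts; real feeds, keys and
# field names would differ. Requires pandas.
import pandas as pd

accounts = pd.DataFrame({"client_id": ["C1", "C2"],
                         "lei": ["LEI00000000000000001", "LEI00000000000000002"]})
trades = pd.DataFrame({"trade_id": ["T1", "T2", "T3"],
                       "client_id": ["C1", "C2", "C3"],        # C3 has no account record
                       "isin": ["GB00B03MLX29", "US0378331005", "DE0007164600"]})
prices = pd.DataFrame({"isin": ["GB00B03MLX29", "US0378331005"],
                       "close": [2450.5, 189.3]})              # one ISIN lacks a price

# Left-join so every trade survives, then flag records that could not be enriched
# instead of silently dropping them.
report = (trades
          .merge(accounts, on="client_id", how="left")
          .merge(prices, on="isin", how="left"))
failed = report[report["lei"].isna() | report["close"].isna()]

if not failed.empty:
    # In a production pipeline this would feed an exception queue, not stdout.
    print(f"{len(failed)} trade(s) failed enrichment:")
    print(failed[["trade_id", "client_id", "isin"]])
```

The design choice worth noting is the defensive join: unmatched records are surfaced as exceptions for repair at source, which is where an outage or missing feed would otherwise go unnoticed.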

The arrival of more and more regulation means that firms will have to invest in data quality to ensure every regulatory requirement is met. Regulators have already made it clear that firms are responsible for the quality of their own data and must ensure that any reporting is both complete and accurate. Failure to do so will lead to more misreporting and, ultimately, more fines.

Steve Tang, Senior Consultant at Axis Corporate.
