EVOLVE21 - an inside peek at data governance at BNY Mellon, the ‘bankers’ bank’

Martin Banks Wed, 10/20/2021 - 03:48
Summary:
At the recent EVOLVE21 virtual event, Michael King from 'bankers’ bank' BNY Mellon talked through what’s coming in financial regulation and how to stay on top of data governance in the finance world.
Magnifying glass with dollar bills (© sirikorn_t – Fotolia.com)

There's a growing burden of compliance on every business, but it places a particularly heavy load on financial institutions — one that is set to grow with a new batch of regulations coming next year. This puts Michael King, Director of Enterprise Data Governance for BNY Mellon, in the hot seat to ensure the bank knows, and can report on, exactly what it's doing with its data. How he goes about this task was the subject of his presentation at the recent Rocket-ASG EVOLVE21 online conference.

As a custodial bank, BNY Mellon goes to market in a very different way from a traditional commercial lending bank. King described it as the ‘bankers’ bank’, and noted that it is one of the oldest banks on ‘the street’, having been founded in 1784.

His specific area of expertise is understanding how data governance processes and technology can help manage the forthcoming changes. One of the underlying keys to this is understanding how compliance issues sit between the classic financial opposites of revenue enhancement and expense reduction. He says:

If you look at it from a data quality standpoint, you can see that companies lose about 12% of their revenue to bad data. And I would suspect that that's probably a low number, but it's at least 12%.

On the expense reduction side of the equation, IBM has estimated that about $3.1 trillion is spent across the US on finding, organizing and wrangling data to get it into a form that can be easily digested. This spend goes on activities such as understanding the lineage and provenance of data, or tracing networks of data dependencies to understand the net effect on downstream data.

The cost of incomplete data governance

Getting this wrong can turn into an expensive mistake. King notes the recent fine of some $400 million levied against Citibank by the Office of the Comptroller of the Currency (OCC). This was imposed because the bank was unable to show compliance with BCBS 239 on effective risk data aggregation and risk reporting; the failure lay in some of the internal controls and governance programs around the provenance of data. King comments:

It's a big deal. It's very costly to banks, financial institutions and fintech companies, as well as other regulated companies. And so it's really critical that this balance is done appropriately.

It is likely to become a bigger deal following the announcement that Saule Omarova has been picked to head the OCC. King suggests her appointment will lead to tougher penalties and heavier regulation, with the potential for tighter restrictions on fintech. So changes in banking over the next three years or so seem inevitable.

All this places even greater emphasis on the need for a firm grip on data — to understand where it's going, where it's coming from, and therefore to understand the business dimensionality across several taxonomies, such as client accounts. He believes this provides a foundation for meeting regulatory requirements as they become ever more stringent.

Some of the trends he sees coming include a need to improve current analytic capabilities. There are inefficiencies in some of the tools available and in how they're being instantiated and used, though the maturity of some is now showing through. BNY Mellon uses ASG's Data Intelligence tool for all its lineage, covering some 1,900 applications, and King says the graphs and lineage charts it produces have great utility. It also supports a number of international regulatory requirements, such as demonstrating the provenance of specific data points like Accounting Rate of Return (ARR) rates (a simplified sketch of this kind of lineage tracing follows the quote below). He says:

We're able to identify, through reconnaissance, critical data elements and see where each of the applications had live ARR rate information in them, to make those changes.
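
To make the idea concrete, here is a minimal, hypothetical sketch in Python of the kind of lineage tracing King describes: representing application-to-application data flows as a directed graph and walking it to find every system a critical data element such as an ARR rate passes through. The application names and flows are invented for illustration; this is not BNY Mellon's or ASG's implementation.

```python
# A minimal, illustrative sketch of lineage tracing, not BNY Mellon's or ASG's
# actual implementation. Application names and flows are hypothetical.
from collections import defaultdict, deque

# Directed edges: (source application, target application, data element carried)
flows = [
    ("TradeCapture", "RiskEngine", "arr_rate"),
    ("RiskEngine", "RegReporting", "arr_rate"),
    ("TradeCapture", "ClientPortal", "account_balance"),
    ("RefDataHub", "RiskEngine", "arr_rate"),
]

def downstream_apps(flows, element, start):
    """Return every application reachable from `start` via flows carrying `element`."""
    graph = defaultdict(list)
    for src, dst, elem in flows:
        if elem == element:
            graph[src].append(dst)
    seen, queue = set(), deque([start])
    while queue:
        app = queue.popleft()
        for nxt in graph[app]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(downstream_apps(flows, "arr_rate", "TradeCapture"))
# e.g. {'RiskEngine', 'RegReporting'}
```

In a real lineage tool the same traversal runs over thousands of applications and connectors, but the underlying structure is still a graph of who produces and consumes each critical data element.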

APIs as part of the answer

He identifies APIs as a development of growing importance in financial data governance, with their use being applied across new areas such as the creation of metadata exchanges. These can be important in tracking and managing the integrations that occur across multiple product suites, and across services such as catalogs, lineage, classification and controls.
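
As an illustration only, the sketch below shows what pushing a catalog entry with lineage and control metadata to such an exchange over a REST API might look like. The endpoint, payload shape and authentication are assumptions made for the example, not any vendor's actual interface.

```python
# A hypothetical sketch of a metadata exchange between two governance tools
# over a REST API. The endpoint, payload shape and token are illustrative
# assumptions, not ASG's or any other vendor's actual interface.
import requests

catalog_entry = {
    "dataset": "ledger.arr_rates",
    "classification": "regulatory-critical",
    "lineage": {
        "upstream": ["refdata.rate_feeds"],
        "downstream": ["reporting.bcbs239_risk_aggregates"],
    },
    "controls": ["completeness-check", "four-eyes-approval"],
}

response = requests.post(
    "https://metadata-exchange.example.internal/api/v1/entries",  # hypothetical endpoint
    json=catalog_entry,
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
response.raise_for_status()
print("Catalog entry registered:", response.json().get("id"))  # response shape is assumed
```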

Such exchanges are now particularly important when dealing with regulators or auditors, because they help demonstrate where data is created, where it is aggregated, and where it is consumed in the form of the reports that go to the Fed or other regulators. Businesses also now have to show where their data quality rules sit and which rules apply at which location (a minimal example of such a rule follows King's comments below). They also have to maintain a record of what the data dictionary or data glossary looks like, whether the business is applying the ISO 10962 standard for the classification of financial instruments or some other ontology around financial instruments. The need now is to be able to demonstrate clearly that the business is in control of the situation. King explains how BNY Mellon goes about this:

The bank is using ASG's DI tool, and we're finding great utility in it, not only for regulatory purposes, which is a big deal because of the regulatory changes. It allows us to be flexible and nimble, to be able to show in any dimensionality, across any application, where data is flowing from point to point, and all the technologies that are moving it, whether it's data at rest, in databases, in CSV files and structures, or data in motion, in flight. So we are able to see and track all the data that goes across.

And as the regulations change you have to be more nimble, you have to be able to show in any sort of dimensionality how those data points change, how those elements move, because you're really not certain what's coming next. We just know that there will be change — it will be highly focused around showing more provenance around the data.
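
As a purely illustrative example of the kind of documented data quality rule a regulator or auditor might ask to see, the snippet below checks that instrument records carry a structurally valid ISO 10962 CFI code (six alphabetic characters). The record layout, sample data and rule are hypothetical.

```python
# A hypothetical data quality rule: check that each instrument record carries a
# structurally valid ISO 10962 CFI code (six alphabetic characters).
# Record layout and sample data are invented for illustration.
import re

CFI_PATTERN = re.compile(r"^[A-Z]{6}$")  # ISO 10962 CFI codes are six letters

instruments = [
    {"isin": "US0000000001", "cfi": "ESVUFR"},   # plausible equity CFI code
    {"isin": "US0000000002", "cfi": "DB12XX"},   # invalid: contains digits
    {"isin": "US0000000003", "cfi": None},       # invalid: missing
]

def check_cfi(record):
    """Return True if the record's CFI code is structurally valid."""
    cfi = record.get("cfi") or ""
    return bool(CFI_PATTERN.match(cfi))

failures = [r["isin"] for r in instruments if not check_cfi(r)]
print("Records failing the CFI rule:", failures)
```

The point is less the rule itself than being able to show a regulator exactly where it runs, which datasets it covers, and which glossary or ontology the classifications come from.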

For the future, King sees an important role for predictive analytics, bringing the ability to forecast that a given percentage of data is likely to drift out of spec over a period of time, and therefore when it is likely to fall out of compliance with regulations. Having the opportunity to take proactive remedial action in plenty of time will be a powerful capability. He sums up:

Vendors like Rocket/ASG are doing a fantastic job in being able to create these hybrid models and connect things together and make it easier for the end user — whether they're a data scientist, an auditor, or a chief data officer — to see how their data is flowing.
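
A minimal sketch of what such a predictive check might look like in practice follows, assuming a daily data quality error rate and a simple linear trend. The figures and threshold are invented for illustration; this is not any vendor's actual model.

```python
# A minimal, hypothetical sketch of a predictive data quality check: fit a trend
# to a daily error rate and estimate when it will breach a compliance threshold.
# All numbers are made up for illustration.
import numpy as np

days = np.arange(10)                              # observation days
error_rate = np.array([0.8, 0.9, 1.0, 1.1, 1.1,    # % of records failing
                       1.3, 1.4, 1.4, 1.6, 1.7])   # quality rules each day

slope, intercept = np.polyfit(days, error_rate, 1)  # simple linear trend
threshold = 3.0                                      # % at which reports fall out of spec

if slope > 0:
    breach_day = (threshold - intercept) / slope
    print(f"At the current trend, the {threshold}% threshold is breached "
          f"around day {breach_day:.0f} - time to remediate upstream feeds.")
else:
    print("Error rate is flat or improving; no breach forecast.")
```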


For more diginomica stories from EVOLVE21 visit our EVOLVE21 event hub. The virtual event ran October 5-6 and sessions are now available to view on-demand. Click here to register and view now.


Disclosure - Rocket Software is a diginomica event hub partner at time of writing

