While distilling the provisions of the 2,300-page Dodd-Frank Act into simple terms is a daunting challenge, one theme pervades nearly every aspect of the legislation: the breadth, depth and frequency of the reporting that regulators are expected to require or request, whether periodically or ad hoc.
The implications of these expanded reporting requirements are significant for institutions' underlying data and application architecture. Data will likely have to be presented in a prescribed format and at a much more granular level than in traditional reporting to support the aggregation and cross-industry analyses that are central to macroprudential supervision and regulation.
Organizations will have to address the modeling, calculation and connectivity capabilities of their applications. Moreover, given supervisors' reliance on these reports for prudential purposes, institutions will need to maintain high-quality data to produce them. The days of satisfying regulators' requests by simply sharing whatever reports an institution uses internally are effectively over.
Many firms will have to make a considerable investment in IT infrastructure, data architecture and attendant controls. And though the timetable for fulfilling most of the legislative requirements appears fairly extended, there is no time to wait on the systems front, given the effort such initiatives demand and the fact that they lie on the critical path for meeting what is expected to be a raft of compliance mandates.
Regulators generally maintain that systems and data issues have not received sufficient attention. The Senior Supervisors Group, for instance, observed in October 2009 that firms faced considerable challenges in developing needed information management systems, partly as a result of IT infrastructures that were the byproduct of mergers and acquisitions over many years. These shortcomings were seen as having constrained the ability of many institutions to aggregate and monitor exposures and impeded risk identification and risk management.
The difficulties some organizations had in responding quickly to far-reaching information requests during and after the crisis, particularly on funding and liquidity and in connection with the Supervisory Capital Assessment Program, are also worthy of mention. Add to this history the expectation that reporting will likely be more extensive and frequent than standard quarterly financial reporting, covering such matters as counterparty exposures, concentrations, margining, and recovery and resolution plans, and the challenges become clear.
Many institutions may be tempted to take a wait-and-see approach until the implementing regulations and reporting requirements are better understood. However, proactively allocating additional resources to data and infrastructure improvements in the near term could bring strategic benefits. Regulators will exert increasing influence over the growth of regulated firms at the same time that many institutions are reconsidering aspects of their business models and seeking to respond nimbly to new opportunities. Because expansion adds complexity to operational and systems environments, it is fair to assume regulators may constrain growth at firms whose systems and data infrastructures are not up to the task.
Firms that have spent years implementing the Basel II regulatory capital framework will appreciate the effort needed to build robust, sustainable data management systems to meet the exacting requirements. Similar standards likely will be applied to a wider array of risk and financial data to meet the new reporting requirements. It is critical to get in front of these challenges rather than wait for the regulators to force the issue. That way, the initiatives can be directed at adding value to the business rather than at meeting compliance requirements.
Donald Vangel is a principal in the financial services office of Ernst & Young LLP.
This commentary originally appeared in American Banker.