By Tom Kimner, Head of Global Operations for Risk Management, SAS
Financial institutions’ risk data aggregation and reporting techniques and systems are receiving increased attention both internally and externally. Regulators have stated that it is critically important for an institution to report holistically and accurately on its key risk indicators, exposures, assets and liabilities across the enterprise for all major risk areas.
For most financial institutions, the need for strong overall data quality, management and reporting practices for risk and financial information is nothing new. Many risk and financial executives will remember that in the early 2000s, the banking industry came under scrutiny for, in some cases, grossly inadequate control and management practices around financial data and reporting. In 2002, the Sarbanes-Oxley Act required banks to improve control of financial reporting – even going so far as requiring top management to individually confirm the accuracy of financial information, with the possibility of jail time for those filing false or inaccurate accounts.
After the financial crisis, data accuracy for risk and financial reporting became even more important as regulators raised stress testing and capital adequacy requirements. Once again, common themes emerged: gaps in data completeness and accuracy, problems with aggregation and consolidation of information, reporting errors, and deficiencies in controls and general management. While regulators worked diligently to tackle capital adequacy issues with stress tests and other quantitative and qualitative exercises, they also began to look more closely at risk data aggregation.
According to the Basel Committee on Banking Supervision (BCBS), risk data aggregation means “defining, collecting and processing risk data in accordance with the bank’s risk reporting requirements to enable the bank to measure its performance against its risk tolerance/appetite.” The committee adds that “this includes sorting, merging or breaking down data sets.”
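As a minimal illustration of the “sorting, merging or breaking down” the committee describes, the sketch below rolls a set of exposure records up to different levels of granularity. The record layout and field names are hypothetical, purely for illustration:

```python
from collections import defaultdict

# Hypothetical exposure records; the field names are illustrative only.
exposures = [
    {"desk": "rates", "book": "trading", "risk_type": "market", "exposure": 120.0},
    {"desk": "rates", "book": "banking", "risk_type": "credit", "exposure": 80.0},
    {"desk": "fx",    "book": "trading", "risk_type": "market", "exposure": 45.5},
]

def aggregate(records, *keys):
    """Roll exposure records up to the level of granularity given by *keys."""
    totals = defaultdict(float)
    for rec in records:
        totals[tuple(rec[k] for k in keys)] += rec["exposure"]
    return dict(totals)

# The same data set, broken down two different ways.
by_risk_type = aggregate(exposures, "risk_type")        # e.g. {("market",): 165.5, ("credit",): 80.0}
by_desk_book = aggregate(exposures, "desk", "book")     # desk-and-book level detail
```

Real risk engines perform this kind of roll-up across millions of positions, but the principle – one consistent data set, queryable at any level of detail – is the same.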
When the committee released BCBS 239: Principles for Effective Risk Data Aggregation and Risk Reporting in 2013, it established a set of basic principles requiring all financial institutions to provide strong governance around their risk data and reporting. The principles have been incorporated into local regulatory regimes in each of the participating jurisdictions, including the United States, the UK, Canada, China, Germany and many other nations. They are built around four broad areas: governance, data collection, risk reporting and supervisory requirements.
Taking a comprehensive approach to the BCBS principles and risk data aggregation and management
A technology solution that helps financial institutions address all four areas of BCBS 239 will provide operational benefits well beyond regulatory compliance. Programmatic risk data management leads to better decision making across your business. Here are some best practices and options to consider:
- Governance: Banks should have process controls and end-to-end transparency of data lineage and quality rules, as well as change management and audit controls. Also critical is full traceability and auditability for compliance, with fully documented report generation rules and controls. Finally, standard, easily customizable templates can help meet regional or local regulatory requirements for electronic filing.
- Data collection: A high-performance risk engine can quickly aggregate positions and exposures and perform a variety of risk calculations across multiple portfolios (such as the banking and trading books). Consider in-memory aggregation and the ability to process data at the lowest level of granularity with visual, dynamic queries. With this speed and capacity, you should be able to interactively explore and analyze data; drill up, down and across on the fly; aggregate instantly at various levels of detail; apply custom filters; and easily rotate, group, rank and sort.
- Risk reporting: A simple, comprehensive view of your data is critical. Powerful analysis and visualization let your bank respond quickly to financial shocks and regulatory scrutiny, with a clear picture of both your overall risk profile and discrete exposures. This supports accuracy, integrity, completeness and timeliness at all levels of granularity.
- Supervisory requirements: To meet regulatory requirements, you need advanced data management capabilities for managing and executing business rules, measuring data quality, tracking lineage and documenting data. Data quality monitoring, with operational and data quality measurements, can be presented in a dashboard-style interface with access to all relevant information. The goal is a consistent approach to delivering accurate data where and when it is needed, building confidence in the accuracy and timeliness of operational and business information.
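To make the rule-based data quality monitoring described above concrete, here is a minimal sketch. The rules, field names and thresholds are illustrative assumptions, not a specific product API; a production system would add lineage tracking, alerting and persistence around the same idea:

```python
# Hypothetical trade records with deliberate quality defects.
records = [
    {"id": "A1", "notional": 1_000_000, "counterparty": "ACME", "currency": "USD"},
    {"id": "A2", "notional": -500,      "counterparty": "",     "currency": "USD"},
    {"id": "A3", "notional": 250_000,   "counterparty": "BETA", "currency": ""},
]

# Each business rule is a named predicate over a single record.
rules = {
    "notional_positive":    lambda r: r["notional"] > 0,
    "counterparty_present": lambda r: bool(r["counterparty"]),
    "currency_present":     lambda r: bool(r["currency"]),
}

def quality_report(records, rules):
    """Return the pass rate per rule - the kind of measurement a
    dashboard-style monitoring interface would display."""
    return {
        name: sum(1 for r in records if check(r)) / len(records)
        for name, check in rules.items()
    }

report = quality_report(records, rules)
# Each rule passes on 2 of the 3 records above, so each pass rate is 2/3.
```

Expressing quality checks as named, executable rules is what makes them measurable and auditable, rather than buried in ad hoc reconciliation scripts.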
Strong risk and financial data management is not just about complying with new regulatory requirements. Better, faster, more consistent risk data aggregation and reporting is essential to compete successfully and avoid unnecessary regulatory and reputational hits. Banks need to ensure that their data is complete, accurate and reliable. This is not a luxury for financial institutions; it is a necessity.