Basel III Data Aggregation – Asking the Impossible?

London, 1st September 2012 – During the peak of the financial crisis many banks struggled to pull together information for regulators, and there were numerous examples of banks providing them with inaccurate information. Uncertainty over the true financial position of banks made it that much harder to put together a rescue. Perhaps the most egregious example is Hypo Real Estate: in 2011, having been rescued by the German government, the company announced a €55bn accounting error attributed to a spreadsheet mistake. The case is not unique; there have been many similar, if less public, examples.

Regulators have been slow to realise the importance of effective IT systems. Many Basel II reporting systems relied on spreadsheets to pull the numbers together. There was no explicit requirement for controls over the data, and many regulatory reporting systems turned out to have major weaknesses.

The June 2012 Basel Committee consultative document “Principles for effective risk data aggregation and risk reporting” attempts to address this problem. It sets out core principles for the aggregation of risk data. For the first time in banking regulation there are explicit requirements for accuracy, completeness and timeliness.

At first glance the principles appear sensible. Who can argue with statements like “Risk management reports should cover all material risk areas in the organisation”?

However, there is a contradiction at the heart of the document. The principles require risk systems to be accurate, comprehensive, real-time, flexible, forward-looking and clear, and able to adapt rapidly whilst maintaining the highest level of control.

The question is how to build systems and processes that can achieve these objectives.

The conventional IT advice is to use an ETL tool to extract data from product processing systems into a data warehouse that stores risk information. Banks have been working on this approach for years, with very mixed success: risk aggregation projects tend to take years, cost a great deal and often fail. Many banks have spent hundreds of millions of dollars on such projects and been forced to cancel them.
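For concreteness, the extract-transform-load pattern described above can be sketched in a few lines. An in-memory SQLite database stands in for both the product system and the risk warehouse; all table names, columns and rates here are hypothetical, chosen purely for illustration.

```python
import sqlite3

# Stand-in "product system": an in-memory table of trades.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product_trades (trade_id TEXT, notional REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO product_trades VALUES (?, ?, ?)",
    [("T1", 1_000_000.0, "EUR"), ("T2", 250_000.0, "USD")],
)

# Extract from the product system, transform (convert everything to EUR at a
# fixed illustrative rate), and load into the warehouse's risk table.
conn.execute("CREATE TABLE risk_positions (trade_id TEXT, exposure_eur REAL)")
FX_TO_EUR = {"EUR": 1.0, "USD": 0.8}  # illustrative rates, not market data
for trade_id, notional, ccy in conn.execute("SELECT * FROM product_trades"):
    conn.execute(
        "INSERT INTO risk_positions VALUES (?, ?)",
        (trade_id, notional * FX_TO_EUR[ccy]),
    )

total = conn.execute("SELECT SUM(exposure_eur) FROM risk_positions").fetchone()[0]
print(total)  # 1,000,000 + 200,000 = 1,200,000 EUR
```

The sketch is trivial; the difficulty in practice comes from doing this across hundreds of incompatible product systems, which is why such projects balloon in cost.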

In reality banks rely on a patchwork of systems held together with spreadsheets. The spreadsheets are there because even well-managed IT projects simply take too long to deliver a useful result. Department managers have to deal with problems as they arise, and develop their own manual processes using spreadsheets and other desktop tools. These processes quickly become core to the business, and huge amounts of time and effort are invested in them. When the organisation decides to automate such a process with a formal system, the investment in the old process has to be thrown away and completely replaced with a new process on new technology. This makes the project significantly more expensive and riskier.

So the key question is how banks can meet the Basel Committee's requirements. There is ample evidence that conventional monolithic data warehousing projects will only work at eye-watering cost, and even then only if the organisation has the technology and project management skills to execute such a project.

Fortunately, a new type of technology that can tackle the problem is beginning to emerge. Spreadsheet control and automation technology allows an organisation to gradually turn its spreadsheet processes into formally controlled systems. These tools provide the control functions that are missing from conventional spreadsheet applications, and then allow users to automate their processes step by step. Users can start by getting existing spreadsheets under control, then incorporate them into a controlled and automated workflow. The end result is a highly automated, reliable process that can be quickly reconfigured to meet new demands on the business, all while placing few demands on IT. The really surprising feature of these projects is that, because they build on existing processes and are incremental, they have an excellent return on investment and are low risk.
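To make the idea of "getting spreadsheets under control" concrete, here is a minimal sketch of the kind of checks such tools apply to a departmental extract: a completeness check (every row carries a desk name and a numeric exposure) and an accuracy check (exposures reconcile to an independently sourced control total). The extract, column names and figures are hypothetical, not taken from any real product.

```python
import csv
import io

# Hypothetical extract from a departmental risk spreadsheet (inline for illustration).
SAMPLE_EXTRACT = """desk,exposure_eur
Rates,1200000
Credit,800000
FX,500000
"""

def run_controls(extract_csv, control_total):
    """Apply basic completeness and accuracy checks to a spreadsheet extract.

    Returns a list of control failures; an empty list means the extract passes.
    """
    failures = []
    rows = list(csv.DictReader(io.StringIO(extract_csv)))
    # Completeness: every row must carry a desk name and a numeric exposure.
    for i, row in enumerate(rows, start=1):
        if not row.get("desk"):
            failures.append(f"row {i}: missing desk")
        try:
            float(row.get("exposure_eur") or "")
        except ValueError:
            failures.append(f"row {i}: non-numeric exposure")
    # Accuracy: the exposures must reconcile to an independently sourced total.
    total = sum(float(r["exposure_eur"]) for r in rows if r.get("exposure_eur"))
    if abs(total - control_total) > 0.01:
        failures.append(f"total {total} does not reconcile to control {control_total}")
    return failures

print(run_controls(SAMPLE_EXTRACT, 2_500_000))  # passes: prints []
```

The point of the incremental approach is that checks like these can be wrapped around spreadsheets the business already uses, rather than requiring those spreadsheets to be replaced first.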

For many banks this is probably the only viable approach to meeting the Basel III requirements. They are stuck with spreadsheet-driven processes, and may not be able to afford the costs and risks of giant data warehousing projects.