FRTB’s tech challenges offer a platform to build the ‘Bank of the Future’ - How ready are you?

In January 2016, the Basel Committee on Banking Supervision (BCBS) published the Fundamental Review of the Trading Book (FRTB), unofficially termed 'Basel IV' by some: the long-expected final version of the revised market risk framework. The new regulation, 'Minimum capital requirements for market risk', aims to tackle often-criticised shortfalls in the current Basel 2.5 regime and to reduce the variability of market risk-weighted assets (RWA) across jurisdictions. It is widely considered to signify a seismic shift in market risk quantification and management.

FRTB introduces many well-documented and complex changes, most notably the use of Expected Shortfall (the average of all losses that are greater than or equal to VaR; for 95% VaR, for example, Expected Shortfall is the average of outcomes in the worst 5% of cases) rather than VaR for calculating the market risk capital requirement, model sharing, applying P&L attribution at desk level and separating out the banking and trading books. All of these create significant challenges, with the key issues identified in Chartis's 2016 FRTB survey as:
• Implementing P&L attribution as a desk level performance metric
• Managing the increased volume of model outputs arising from both the internal model and revised standardized calculations
• Tracking and managing trading desk-level model approval and testing
• Sourcing data for ‘non-modelable’ risk factors
• Defining trading desks for the purposes of FRTB
• Calibrating correlations in the incremental default risk charge to price data
• Handling the massive increase in computation needed to run Expected Shortfall calculations
• Back-testing Expected Shortfall
The move to the Expected Shortfall method alone will drive a huge increase in data processing and, when combined with the revaluation requirement for all non-linear risk positions, the incorporation of liquidity horizons and intraday risk monitoring, will require an exponential increase in compute; a simple sketch of the Expected Shortfall calculation follows below. Ultimately, banks need to collect, process and report on more data, in more ways and sooner.
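
As a minimal illustration of the Expected Shortfall definition quoted above, the Python sketch below computes VaR and ES from a vector of simulated daily P&L outcomes. The random data, the 250 scenarios and the 95% level (mirroring the example in the text) are illustrative assumptions only; the FRTB standard itself prescribes ES at the 97.5% level with liquidity-horizon adjustments.

```python
# A minimal sketch only: VaR and Expected Shortfall from simulated P&L outcomes.
# The random P&L series, the 95% level and the 250 scenarios are illustrative
# assumptions, not the FRTB calculation itself.
import numpy as np

rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1_000_000, size=250)   # one year of hypothetical daily P&L

alpha = 0.95
var = -np.quantile(pnl, 1 - alpha)           # 95% VaR: the loss exceeded in 5% of cases
tail = pnl[pnl <= -var]                      # the worst 5% of outcomes
es = -tail.mean()                            # Expected Shortfall: average loss in that tail

print(f"VaR(95%) = {var:,.0f}   ES(95%) = {es:,.0f}")
```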

Notably missing from the Chartis survey is the timeline challenge for FRTB. Perhaps this is because it doesn't come into effect until 1st January 2020, which is almost three years away, right? Well, you can only go live with what your regulator has approved, and historically approvals have taken around a year. Presumably regulators have increased their capacity in preparation for the flood of FRTB applications, but even so, it's prudent to allow a year for approval.

That allows us until 1st January 2019; but you also have to submit a year's worth of test results! In theory you could do that year of runs on 31st December 2018, but we already know the calculations will be so computationally intense that most banks will struggle to complete them in a day. You've also got to factor in testing, analysis and defect fixing so that your massive new analytics and risk framework is ready for regulatory submission - that's at least a year's work!

The date is now 1st January 2018 ... and that means your solution needs to be ready by the end of this year. And that means you need to be working on it today, but not everyone is.

And whilst the timeline is a significant challenge and risk, managing the cost and scale of the change remains the biggest challenge of all. From a technology standpoint, FRTB (and BCBS 239) compliance will be incredibly expensive and will put significant additional strain on resources: integrating Front Office systems with Risk and Finance; deploying, testing and approving new models; introducing new data sources and ten years of validated history for every risk factor; and providing highly scalable compute and data storage will all be major new, and expensive, initiatives for the vast majority of banks.

But there is a new beginning for those firms who use these new regulations as a business case to invest in new technology. In fact, smart Financial Services firms are looking at the challenges the regulation presents as a way of developing competitive advantage whilst minimizing risk and maintaining compliance.

Consolidating traditionally disparate systems and integrating Front Office platforms (each desk or asset class with its own trading, booking, pricing and reporting systems) with Risk, Treasury and Finance systems will lead to consistent data sources and data sets from the front office through risk to finance, with demonstrable lineage, a common taxonomy and analytics shared by Risk and Finance. This in turn reduces time-consuming data sourcing, aggregation and reconciliation processes and the duplication of data and calculations (amongst other historical inefficiencies).

FRTB is also an opportunity to leverage Big Data, Business Analytics, Cloud, Compute Grids and Data Science technology to enhance models, market data management and understanding of macro level interdependencies.

Big data platforms such as Hadoop, when coupled with big data analytics (for the transformation and enrichment of data) and data science platforms, can store and analyse far more data to identify trends and gain valuable insights. Expected Shortfall results, for example, could be broken down by client and plotted across time to reveal seasonal risk trends for different clients, which could then be used to generate more business or to mitigate risk with certain clients during certain periods; a sketch of this follows below. In fact, predictive and prescriptive analytics tools, when used to help manage and interpret the huge quantities of data that FRTB (and other regulatory reporting requirements) demands, will offer opportunities beyond regulatory compliance: better trend spotting, fraud and insider-dealing prevention, predicting regime changes and sensing market sentiment can all provide new competitive advantages.
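
As a hedged sketch of that idea, the snippet below groups scenario-level P&L by client and calendar month and computes Expected Shortfall for each slice. The file name, column layout and 97.5% level are assumptions for illustration, not a prescribed format.

```python
# Illustrative only: slicing Expected Shortfall by client and month to look for
# seasonal risk patterns. File name and columns (client, date, pnl) are hypothetical.
import pandas as pd

def expected_shortfall(pnl: pd.Series, alpha: float = 0.975) -> float:
    """Average loss in the worst (1 - alpha) tail of a P&L distribution."""
    cutoff = pnl.quantile(1 - alpha)
    return -pnl[pnl <= cutoff].mean()

# One row per (client, date, scenario) with a simulated P&L value.
scenarios = pd.read_csv("client_scenario_pnl.csv", parse_dates=["date"])

seasonal_es = (
    scenarios
    .assign(month=scenarios["date"].dt.month)
    .groupby(["client", "month"])["pnl"]
    .apply(expected_shortfall)
    .unstack("month")        # rows: clients, columns: calendar months
)
print(seasonal_es.round(0))
```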

On a practical level, simply bringing the analytics to the data (for example, using in-memory analytics for the aggregation and analysis of data rather than moving huge volumes of data between different systems and message queues) lets the bank deliver quick and efficient insights into its risk profile in near real time; the idea is sketched below. Furthermore, by implementing workload automation within a Big Data platform, banks can also save time, prevent errors, reduce headcount, and deliver innovation and business services faster.
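
A minimal sketch of "bringing the analytics to the data", assuming a Spark cluster co-located with the P&L store: the aggregation runs where the data lives and only the small summary is returned to the caller. The path, table layout and column names are hypothetical.

```python
# Sketch only: aggregate position-level P&L up to desk/scenario level on the
# cluster that holds the data, returning just the summary rather than shipping
# the full dataset between systems. Path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("frtb-aggregation-sketch").getOrCreate()

pnl = spark.read.parquet("hdfs:///risk/pnl_vectors")       # hypothetical location

desk_scenario_pnl = (
    pnl.groupBy("desk", "scenario")
       .agg(F.sum("pnl").alias("desk_pnl"))
)

summary = desk_scenario_pnl.toPandas()   # only the small aggregated result leaves the cluster
print(summary.head())
```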


Of course, the key to successfully leveraging any of these solutions, whilst ensuring the regulatory requirements are met, lies in the ability to improve overall data quality and management. In this domain we expect data sourcing and data management vendors such as Asset Control, GoldenSource and Markit to move towards delivering shared-service models for the banks: unified models for the collection of transactional data, identification of gaps, introduction of new sources, and validation of the ten years of history required across every single risk factor (a simple gap check is sketched below). If this proves possible (it has its own set of challenges), such a model would also represent a significant opportunity for banks to increase efficiency and outsource a process that many have long treated as a necessary evil and therefore underinvested in for years.
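
As one small example of the validation work described above, the sketch below checks whether a single risk factor's history covers the required ten-year window and lists any missing business days. The naive weekday calendar (no holiday handling), column names and file are illustrative assumptions.

```python
# Hypothetical data-quality check: flag business days missing from a risk
# factor's ten-year price history. Column and file names are assumptions.
import pandas as pd

def missing_business_days(history: pd.DataFrame, years: int = 10) -> pd.DatetimeIndex:
    """Return the business days in the last `years` years with no observation."""
    end = history["date"].max()
    start = end - pd.DateOffset(years=years)
    expected = pd.bdate_range(start, end)                  # naive weekday calendar
    observed = pd.DatetimeIndex(history["date"]).normalize()
    return expected.difference(observed)

history = pd.read_csv("risk_factor_history.csv", parse_dates=["date"])
gaps = missing_business_days(history)
print(f"{len(gaps)} business days missing from the ten-year window")
```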

The extent to which banks are leveraging these opportunities, and their level of readiness, varies significantly and is heavily influenced by their historical technology choices. Some banks, for example, believe their existing infrastructure will handle the new data and analytics requirements and that they can simply aggregate across existing systems to meet the new reporting requirements. This, however, is realistic only for the handful of banks that have built bespoke, integrated trading and risk platforms with common analytics, a central data warehouse and a large, highly scalable grid farm.

Others have identified where bottlenecks and missing data will reduce calculation efficiency and are adding Big Data, Business Analytics, Data Science and disruptive FinTech 'point' solutions to address the shortfalls, whilst banks that already use third-party trading and risk management solutions such as Murex, Calypso and IHS Markit are looking to those vendors to provide FRTB functionality. The third-party vendor approach removes much of the traditional design and build complexity (risk engine design, exposure management and the standardised regulatory calculations), leaving the bank to focus on changes to the business process and the implementation of risk models. This makes particular sense where a vendor platform is already used across multiple asset classes and for end-to-end processing. Equally, it may push the bank to rationalise multiple vendors onto a single consolidated platform that provides native FRTB functionality, or complement existing system consolidation initiatives. The argument for such initiatives is strengthened by the need to centralise all the data required to calculate FRTB outputs, whether sourced empirically through trade and market data or through the aggregation of P&L vectors; a short illustration follows below.
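
A short illustration, on purely hypothetical data, of why centralising P&L vectors matters: desk-level Expected Shortfall figures cannot simply be summed, so the scenario-level vectors are combined first and ES is recomputed on the aggregate. Desk names, the simulated data and the 97.5% level are assumptions for the sketch.

```python
# Illustrative only: aggregate desk P&L vectors scenario by scenario, then
# recompute Expected Shortfall on the combined vector.
import numpy as np

def expected_shortfall(pnl: np.ndarray, alpha: float = 0.975) -> float:
    cutoff = np.quantile(pnl, 1 - alpha)
    return -pnl[pnl <= cutoff].mean()

rng = np.random.default_rng(1)
desk_pnl = {                                     # one P&L vector per desk, shared scenarios
    "rates": rng.normal(0, 1.0e6, size=250),
    "credit": rng.normal(0, 0.5e6, size=250),
}

sum_of_desk_es = sum(expected_shortfall(v) for v in desk_pnl.values())
bank_pnl = np.sum(list(desk_pnl.values()), axis=0)   # element-wise sum across desks
bank_es = expected_shortfall(bank_pnl)

print(f"Sum of desk ES: {sum_of_desk_es:,.0f}  vs  ES of aggregated P&L: {bank_es:,.0f}")
```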

But this only scratches the surface of the complex technical challenges and opportunities presented by FRTB. What we do know is that the resulting choices, and their impact, will be extensive across both individual banks and the wider industry. The fact that these requirements and changes are converging with a cycle of major FinTech innovation, a resurgence of Data Science and an era of cheap compute can only help improve the quality, functionality and adoption of technologies normally the preserve of industries and markets renowned as cool, exciting and pioneering. Perhaps this is the wake-up call needed for banks of all sizes to rethink how they future-proof their technology and ensure it is fit for purpose for the 2020s?

Author: Adrian Marshall.
Contributors: Nick Thomas, Marc Maynard and Christian Marshall
