Posted: 21 February 2013 | Author: Denny Yu | Source: Numerix
Risk management depends on data. But post-crisis, risk management in banking poses unique Big Data challenges – from fragmented infrastructure to increased regulation, product complexity and market volatility. As the regulatory and market environment demands increasingly complex calculations and the volume of data collected grows, so too do expectations for the accuracy and frequency of the calculations used in analysis and reporting.
As banks move from Value at Risk (VaR) to Potential Future Exposure (PFE) and Credit Value Adjustment (CVA) methodologies, the number and scale of calculations for a reasonably sized portfolio have risen considerably. The example below shows the number of calculations on a small portfolio increasing 800-fold when moving from VaR calculations to PFE calculations.
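A back-of-the-envelope count illustrates how such a jump can arise. The figures below are purely illustrative assumptions, not from the source: a historical-simulation VaR revalues each position once per historical scenario, while a Monte Carlo PFE revalues each position on every simulated path at every future exposure date.

```python
# Illustrative sketch with hypothetical figures: calculation counts for
# historical-simulation VaR versus Monte Carlo PFE on a small portfolio.

positions = 100        # instruments in the portfolio (assumed)
var_scenarios = 250    # roughly one year of daily historical scenarios (assumed)

mc_paths = 2_000       # Monte Carlo paths for the PFE simulation (assumed)
time_steps = 100       # future exposure dates per path (assumed)

# VaR: one revaluation of each position per historical scenario.
var_calcs = positions * var_scenarios

# PFE: one revaluation of each position per path per exposure date.
pfe_calcs = positions * mc_paths * time_steps

print(f"VaR revaluations: {var_calcs:,}")      # 25,000
print(f"PFE revaluations: {pfe_calcs:,}")      # 20,000,000
print(f"Increase: {pfe_calcs // var_calcs}x")  # 800x
```

With these assumed inputs the revaluation count grows from 25,000 to 20 million, an 800-fold increase consistent with the scale described above; actual figures depend on portfolio size, path count and time grid.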
For banks aiming to calculate CVA on a weekly or daily basis, the sheer number of calculations demands considerable compute power, and these data points must be generated in minutes rather than hours or days.