When regulation increases, so does the need for automation; otherwise, the cost of compliance and the risk of violations become unsustainable. The upcoming Fundamental Review of the Trading Book (FRTB) requirements provide an excellent example of how enterprises can bring together Big Data and workload automation to meet compliance requirements effectively while minimizing the burden on staff. BMC is currently working with several leading international financial institutions to meet the FRTB challenge, and we’ll share some perspective here.
FRTB is a new set of regulations from the Bank for International Settlements (BIS) that redefines risk management methodologies and reporting requirements for financial institutions. The regulations go into effect January 1, 2019, yet as of August 2016 only 15 percent of organizations surveyed said they were on track to meet the deadline, according to one industry report.
FRTB changes the formulas for how risk is measured and requires new factors to be considered. One of the most significant changes is that institutions must move away from the Value-at-Risk (VaR) model commonly used to calculate capital risk to the Expected Shortfall (ES) method. This is a fundamental change that will drive significant computational challenges and a tremendous increase in the amount of data generated and collected. According to one analysis, compliance using Monte Carlo methodology and existing technology will require 20 times more calculations, take 20 times longer, and demand a corresponding increase in data storage capacity. The same analysis predicts that the resulting increase in time, effort, and computational and storage costs will push financial institutions toward the Historical Simulation methodology in order to contain costs and complete calculations within reasonable SLA timeframes.
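To make the VaR-to-ES shift concrete, here is a minimal Historical Simulation sketch on a hypothetical series of daily P&L figures. The data, random seed, and index conventions are illustrative assumptions, not FRTB-prescribed parameters; the key difference it shows is that VaR reads off a single loss quantile, while Expected Shortfall averages the whole tail beyond it.

```python
import math
import random

# Hypothetical one-year history (250 trading days) of daily portfolio P&L,
# standing in for the historical scenarios a bank would pull from trade data.
random.seed(7)
pnl = [random.gauss(0, 1_000_000) for _ in range(250)]

def var_historical(pnl, level=0.99):
    """Historical-simulation VaR: the loss exceeded on roughly the worst
    (1 - level) fraction of days."""
    ordered = sorted(pnl)                          # worst outcomes first
    idx = int(math.floor((1 - level) * len(ordered)))
    return -ordered[idx]

def expected_shortfall(pnl, level=0.975):
    """Expected Shortfall: the average loss over the tail of days worse
    than the VaR cutoff, so it captures *how bad* the bad days are."""
    ordered = sorted(pnl)
    k = max(1, int(math.floor((1 - level) * len(ordered))))
    tail = ordered[:k]                             # the (1 - level) worst days
    return -sum(tail) / len(tail)

print(f"99%   VaR: {var_historical(pnl, 0.99):>12,.0f}")
print(f"97.5% ES:  {expected_shortfall(pnl, 0.975):>12,.0f}")
```

Because ES averages every scenario beyond the cutoff rather than reading one, each recalculation touches the full tail of the scenario set, which is one reason the data and compute volumes grow so sharply at enterprise scale.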
Calculating and reporting Expected Shortfall is just one of many changes that FRTB imposes on financial services institutions. The net effect of the new regulations can be summarized as requiring financial institutions to collect more data, process and report it in more ways, and do it all more frequently.
Some BMC customers believe there is a better way. They are taking digital transformation in new directions to meet FRTB and other regulations effectively with minimal impact on staffing. They are doing this by using Hadoop to meet FRTB data requirements, and by using Control-M to bring Hadoop into the enterprise IT architecture.
Having a Big Data infrastructure isn’t an official FRTB requirement, but it is a de facto one for complying in a cost-effective, timely manner. Hadoop is an especially effective enabling technology for meeting FRTB requirements because it is so scalable. Hadoop works very well for importing and exporting large amounts of data (both structured and unstructured) from different sources, and for distributing the processing workload to minimize strain on the computing environment. Hadoop makes it possible to execute the data analysis and reports needed for FRTB compliance as frequently as required, including intra-day trading analysis.
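Hadoop’s ability to distribute that workload rests on the MapReduce pattern: a map step runs on each worker node close to its slice of the data, and a reduce step aggregates the results by key. Here is a toy pure-Python sketch of the pattern, using a handful of invented (desk, P&L) trade records and local processes in place of the billions of rows and worker nodes a real cluster would have:

```python
from collections import defaultdict
from multiprocessing import Pool

# A handful of hypothetical (desk, daily P&L) trade records. In a real
# deployment these would be billions of rows spread across HDFS blocks.
trades = [
    ("rates", -120_000), ("fx", 45_000), ("rates", 80_000),
    ("equities", -15_000), ("fx", -60_000), ("equities", 30_000),
]

def map_phase(record):
    """Map step: emit a (key, value) pair per record. Hadoop runs this on
    each worker node, close to the data block it owns."""
    desk, pnl = record
    return (desk, pnl)

def reduce_phase(pairs):
    """Reduce step: aggregate values per key. Hadoop shuffles the pairs so
    each reducer sees every value for its keys."""
    totals = defaultdict(int)
    for desk, pnl in pairs:
        totals[desk] += pnl
    return dict(totals)

if __name__ == "__main__":
    with Pool(2) as pool:        # two local processes stand in for worker nodes
        pairs = pool.map(map_phase, trades)
    print(reduce_phase(pairs))   # net P&L per desk
```

The same split-then-aggregate shape is what lets a Hadoop cluster scale an ES calculation across nodes instead of straining a single machine.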
Hadoop can quickly process large amounts of data and generate reports, but first organizations need to tell Hadoop what to do, and that in itself can be a major task. That’s where workload automation comes in. By automating steps in the Hadoop job development process, enterprises can save time and prevent errors, and as a result deliver innovation and business services faster. Depending on how Hadoop workflow development is implemented, enterprises may be able to avoid hiring or outsourcing for specialized skills. For example, Control-M for Hadoop lets users build, test, promote, schedule and manage Hadoop workflows using the same interface used for all other enterprise batch jobs – no special Hadoop development knowledge is needed.
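The core of what such a tool manages is dependency ordering: each job declares what must finish before it starts, and the scheduler derives a valid run order. A minimal sketch of that idea, with invented job names for a hypothetical FRTB batch run – this is a generic illustration, not Control-M’s actual job-definition format:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical FRTB batch workflow, expressed as job -> jobs it depends on.
workflow = {
    "ingest_trades":    [],                    # land trade data in Hadoop
    "run_es_calcs":     ["ingest_trades"],     # Expected Shortfall batch job
    "generate_reports": ["run_es_calcs"],      # regulatory reporting output
    "archive_results":  ["generate_reports"],  # retain evidence for auditors
}

def run_order(jobs):
    """Return an execution order where every job runs after its dependencies."""
    return list(TopologicalSorter(jobs).static_order())

print(run_order(workflow))
# -> ['ingest_trades', 'run_es_calcs', 'generate_reports', 'archive_results']
```

Declaring the workflow once and letting the scheduler enforce the ordering is what removes the error-prone manual sequencing – the same benefit at cluster scale that this six-line example shows in miniature.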
There are many paths financial institutions can pursue for FRTB compliance. We have heard of organizations that have only a few people actively working on FRTB compliance, and others that have dozens and expect to need more than 100. Why the large difference? Digital maturity is a large part of the answer. Hadoop and workload automation can take a lot of the labor, risk and cost out of meeting FRTB requirements, as BMC customers are proving in the real world every day.
Want to Learn More About Big Data and What It Can Do for You?
BMC recently published an authoritative guide on big data automation. It’s called Managing Big Data Workflows for Dummies. Download now and learn to manage big data workflows to increase the value of enterprise data.