August 01, 2022
Much ink has been spilled about the banking industry’s slow roll to the cloud. Whereas enterprises in other industries long ago began adopting cloud and hybrid cloud environments, stringent financial regulations have largely limited banks, and notably investment banks, to cloud work in software development and testing, and in customer relationship services.
But cracks in the sky are forming, and in an interesting twist, a forthcoming compute-hungry regulation, the Fundamental Review of the Trading Book (FRTB), could be the catalyst that opens the cloud to investment banks at scale for the first time.
Drafted in the wake of the 2008 financial crisis, the FRTB provides guidelines to help ensure banks remain solvent, with sufficient reserves in cases of emergency. Scheduled to be implemented globally in January 2023, the regulation will require banks to perform significantly more complex and risk-sensitive calculations on the ‘trading book’, which covers tradable assets such as equities, debt, commodities, foreign exchange, derivatives, and other financial contracts.
This will be easier said than done, however, as investment banks lack the computational muscle – the IT – to support these compute-intensive risk calculations. In fact, few enterprises have the kind of firepower within their data centers to handle such extreme workloads.
Computational muscle is where the cloud can help, as it has helped countless enterprises in other industries. And the move has already begun: one financial institution we work with recently leveraged upwards of 150,000 virtual cores on AWS to begin handling its rising computational workload.
Another enticing aspect of the cloud is the ability to take advantage of consumption-based pricing: paying for what you use, only when you use it. This is extremely attractive to investment banks, whose risk computation workloads are sporadic, varying from day to day. When computations aren’t underway, banks have little use for idle servers. By shifting these new compute-intensive risk computations to the cloud under consumption-based pricing, banks can reach economies of scale that would be impossible if they had to make capital expenditures for the necessary servers.
Considering the computational and consumption-based pricing advantages together, the fit between the cloud and investment banks arguably couldn’t be better. But that’s not to say it’ll be easy.
Managing such cloud and infrastructure processes can be as challenging as the compute-intensive workloads themselves. Our approach is to assume from the bank the burden of managing the information technology (IT) and operational technology (OT) that surround its risk computation programs and processes – from the software down to the cloud infrastructure at the hyperscaler. For example, through our Hitachi Application Reliability Center (HARC) technologies and services, we can establish a virtual risk computation hub of sorts for investment banks, managing the software, cloud and infrastructure as a managed service.
Less than two years ago, Austria’s Raiffeisen Bank International (RBI) embarked on a cloud transformation journey with Hitachi Vantara to rebuild its entire application landscape to be more accessible, agile and productive. Today, everything at the bank is built with a cloud-native mindset. That meant moving away from large, monolithic applications running across multiple locations and toward a more microservices-based, layered API architecture – one that can be shared and scaled across its entire network of 14 banks across Central and Eastern Europe. The HARC team of experts is well acquainted with such extensive and integrated cloud projects with financial institutions like RBI.
In addition to the IT and OT management challenges, there is the data challenge. Before banks of any size can do anything in the cloud, a complete classification of their data is needed. Several categories of data confidentiality dictate which data can sit in the public cloud, which can reside there only under confidentiality controls, and which must remain private. Only after such categorization takes place can work begin on planning a cloud initiative. Miss this step, and it can be costly. One of the larger European banks had decided to move the majority of its applications and data to the Google Cloud Platform. As it began the migration, however, it learned it was required to obtain regulatory approval from no fewer than 70 different jurisdictions before going further.
Alternatively, Hitachi has worked with quite a few financial institutions on hybrid data approaches: robust data categorization followed by strategic placement of data in the optimal location – on premises, in the cloud, or both. As a result, banks can tap the on-demand elasticity of the cloud while keeping sensitive processes within the data center. It’s the best of both worlds.
Although finance, and investment banking in particular, may have been sluggish in embracing cloud computing, this pivotal sector of the global economy may well become a case study in the coming months and years for business transformation amid disruption.
Be sure to check out Insights for perspectives on the data-driven world.