The future of the banking industry is in flux. Financial institutions face a downward trend in fees, greater competition from non-banks, and disruption across the industry. The rise of an interconnected global financial system that includes a complex web of institutions, markets, trading partners, and instruments has brought with it a need for better, faster, and more powerful computing at tremendous scale.
In this context, data — and how it is being stored, accessed, and analyzed — will be the deciding factor for the fate of many in the banking industry. The financial markets are fueled by processing of information with increasingly sophisticated approaches that yield better results around scenario and risk analysis.
A major bank IT executive recently told me, “The bank is moving from an analog world to a digital world.” In an analog world, a banker would approach a customer with a paper pitch book and walk through an idea. In the digital world, information is at a CFO’s fingertips and executives can make decisions (“Is it time for a secondary?” “How do I want to structure debt?”) in real time.
There’s the rub. Everyone is focused on these modern capabilities but, in order to support this more interactive and fluid type of banking, old infrastructure models need to change — fundamentally.
Most forward-thinking banking organizations are looking at the cloud as a new canvas on which to draw the future of the data center. The cloud represents a monumental shift. With the public cloud, the customer no longer has to build and deploy data centers. Maybe that sounds like a simple statement, but it’s actually quite revolutionary. The public cloud, and its ability to provide tens of thousands of processors on demand, is the key to next-generation financial efficiency — and it will ultimately lead to better investment performance.
At the heart of the cloud’s value proposition are agility and efficiency. Think about this from a common-sense perspective. Nearly every company in the world is building its own data centers. That’s like every company building its own furniture. Of course, this redundancy is inefficient.
We now see a small number of companies from the consumer space getting into large-scale data centers (Amazon, Microsoft, and Google). Their scale is 10X or even 100X larger than any single enterprise data center, and they are building massive facilities that support as many as 10 million servers. When building data centers at that scale, a significant amount of automation goes into orchestrating and managing these systems. Anecdotally, servers in the cloud run at much higher efficiency, whereas servers in custom-built data centers hiding behind a bank’s four walls run at very low utilization rates; 20% is not an uncommon number in an enterprise. The net power consumed by running an application in the cloud is a small fraction of what it is on-premises. All of this automation and technology results in lower unit costs on a fully utilized basis.
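The utilization gap translates directly into unit cost. Here is a back-of-the-envelope sketch; the 20% on-premises utilization comes from the discussion above, while the cloud utilization and the dollar figure are illustrative assumptions, not measured numbers:

```python
# Back-of-the-envelope unit-cost comparison based on utilization.
# The 20% on-premises figure comes from the article; the 65% cloud
# figure and the $1.00/hour cost are illustrative assumptions only.

def cost_per_useful_hour(hourly_cost, utilization):
    """Cost of one hour of *useful* compute at a given utilization rate."""
    return hourly_cost / utilization

on_prem = cost_per_useful_hour(hourly_cost=1.00, utilization=0.20)
cloud = cost_per_useful_hour(hourly_cost=1.00, utilization=0.65)

print(f"on-prem: ${on_prem:.2f} per useful compute-hour")
print(f"cloud:   ${cloud:.2f} per useful compute-hour")
```

At 20% utilization, every useful compute-hour costs five times the raw hourly rate; the point is that idle servers still draw power and depreciation whether or not they do work.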
The real magic of the cloud, however, is the agility it provides. By running on this massive shared infrastructure and paying only for resources actually consumed, the cloud gives financial institutions the flexibility and elasticity to run very large-scale analyses — often with tens of thousands of processors — but for finite periods of time, such as between market close and market open. This flexibility and efficiency enables forward-thinking institutions to create meaningful competitive advantage by using computing in ways that simply were not practical just a few years ago. The public cloud represents a radical change not just in the technology, but also in the way institutions use the technology to drive their businesses.
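The elasticity point can be made concrete with some simple arithmetic. The numbers below are hypothetical (a fixed-size risk job under ideal scaling), but they show why renting many cores briefly fits the close-to-open window in a way a small owned cluster cannot:

```python
# Illustrative numbers only: a batch risk analysis needing a fixed amount
# of compute, under the idealized assumption of perfect linear scaling.

JOB_CORE_HOURS = 40_000  # total compute the overnight analysis needs (assumed)

def wall_clock_hours(cores):
    """Hours to finish the job on a given number of cores (ideal scaling)."""
    return JOB_CORE_HOURS / cores

print(wall_clock_hours(500))     # small owned cluster: 80 hours, over 3 days
print(wall_clock_hours(10_000))  # cloud burst: 4 hours, fits close-to-open
```

Because the cloud bills for cores actually consumed, the 10,000-core burst and the 500-core marathon cost roughly the same in core-hours, but only one of them delivers answers before the market opens.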
While change creates new opportunities, it also unleashes fear. To date, financial institutions have been restrained in their adoption of the public cloud because of data privacy and security concerns. Some organizations resisting the public cloud believe it’s better to build their own private environment and control it themselves.
The reality is that’s just not true. The interesting thing about security is that we associate protection with a physical manifestation. In other words, if I can touch the box, it’s more secure than if I can’t touch the box. While there was a time when this need for physical security might have been true, those days are gone. Most major breaches have happened in on-premises data centers under the company’s total physical control. Look at the major attacks over the past few years: Sony, Home Depot, JPMC, Target. All were on private, maintained (in many cases very well maintained) infrastructure — proving that physical control of an asset does not equate to its security.
Think about it: Building a fort is an antiquated security model. Putting a fence and a guard around a server isn’t safe enough for today’s threats. In today’s world the new boundary or perimeter comes from software-based encryption. Encryption is becoming a fundamental requirement for a dynamic, cloud-based world. When all data is encrypted all the time, the issue becomes where and how you manage the keys, not the actual terabytes or even petabytes of data.
Properly implemented encryption provides a baseline of data protection that is far more relevant than padlocks or guns. Furthermore, encryption can be used as a point of policy enforcement. For example, if a dataset is to be accessed by the trading organization but not by the analyst group, that policy can be encoded into the key management system. Each time a read request comes in for that dataset, the encryption keys must be accessed, and the policy can be checked. In plain English, the system will say, “I see you’re trying to read a dataset that is accessible only by the trading group, but you aren’t in the trading group, so access denied.” The powerful aspect of this strategy is that the policy is attached to the data itself via encryption. So wherever the data moves, gets copied, or gets backed up, the policy goes with it. It is an extremely powerful and robust security model, built entirely in software and spanning heterogeneous clouds.
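The key-release check described above can be sketched as a toy key service. Every name here (the class, the dataset ID, the group labels) is a hypothetical illustration, not a real key-management API; a production deployment would use a hardened KMS, not an in-memory dictionary:

```python
# Toy sketch of encryption-key-based policy enforcement: the access policy
# lives with the key, so every copy or backup of the ciphertext is governed
# by the same rule. All names are hypothetical illustration.

class AccessDenied(Exception):
    pass

class ToyKeyService:
    def __init__(self):
        # key_id -> (key material, set of groups allowed to use it)
        self._keys = {}

    def register_key(self, key_id, key_material, allowed_groups):
        self._keys[key_id] = (key_material, set(allowed_groups))

    def get_key(self, key_id, requester_groups):
        # The policy check happens here, at key release, not at the storage
        # layer -- which is why it travels with the data wherever it goes.
        key_material, allowed = self._keys[key_id]
        if allowed.isdisjoint(requester_groups):
            raise AccessDenied(
                f"dataset key {key_id!r} is restricted to {sorted(allowed)}"
            )
        return key_material

kms = ToyKeyService()
kms.register_key("trades-2016", key_material=b"\x00" * 32,
                 allowed_groups={"trading"})

# A trader obtains the key; an analyst is refused, no matter where the
# encrypted dataset was copied or backed up.
key = kms.get_key("trades-2016", requester_groups={"trading"})
try:
    kms.get_key("trades-2016", requester_groups={"analysts"})
except AccessDenied as e:
    print("access denied:", e)
```

The design choice worth noting is that the storage layer never sees the policy at all: ciphertext can sit anywhere, and enforcement happens only at the single choke point where keys are released.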
Cloud adoption is growing in banking. Some firms are experimenting with their first workloads, while others are totally reinventing their IT around cloud models. More and more organizations now see that we are in a fundamentally new computing paradigm — a shift as big as mainframe to client-server, and client-server to mobile apps.
Data analysis and storage, and the technologies that enable them to happen securely and quickly, will allow the financial institutions that move to the public cloud first to win out over their slower-moving counterparts. While all companies will be at least partially on the public cloud in the future, there will be winners and losers in the race to get there first, and it pays to have the first-mover advantage. You can bank on that.
Edited by Stefania Viscusi