Will Data Be the New Transformation Frontier for Banks?

Sophie Dionnet

This article was first published in the FinTech Journal in Japan.

Over the past 15 years, the financial services system has gone through significant change, driven by the surge in regulation that followed the 2008 financial crisis. New prudential norms, digitalization of services, acute scrutiny of operational resilience, and the development of ESG are a few of the strategic priorities all banks had to tackle.


All the current players have navigated this era of change, managing to adjust their systems and operational processes to an unprecedented shift in customer behaviors while adapting all activities to new regulatory demands. If there is one thing to take away from 2023, it is proof that the financial sector’s resilience has vastly improved. A major European bank (Credit Suisse) collapsed, the U.S. mid-market financial sector was severely shaken by SVB’s bankruptcy, inflation surged to uncommon heights and, yet, the aftermath of what could have been an economic earthquake has remained contained. Some financial institutions emerged stronger, others shaken but, still, the system has shown that it has reached a new level of stability.


So what’s next? Is the status quo the new norm, or are there still transformation drivers for financial institutions to bet on?

Over the last 12 months, the wave of excitement and scrutiny triggered by Generative AI has been a strong wake-up call that, when it comes to data and AI, the revolution is far from complete. And while this is true for all economic sectors, the potential for banks and other financial institutions is particularly strong and ready to be reaped by those who put the right setup in place to strike the necessary balance between daring and governing. For financial institutions willing to make 2024 the year of data impact, here are four guidelines to keep in line of sight:

1. The Means to Deliver Data Transformation Are Right at Hand: 

Many financial institutions still linger on the data transformation path for fear of not being sufficiently equipped to make it happen: Is data sufficiently clean and organized? Are people trained? Have governance systems been implemented? Should full cloud migrations be completed first? Will data be the remit of a few or of all — and how should priorities be defined, value computed, and various analytics and AI approaches be qualified? 

These are all valid questions that are being actively worked on by data offices and can only be tackled as efforts continue. When taking a step back, banks and other financial institutions are uniquely equipped to engage on the path to turning data into a true transformation driver: Their level of data intimacy and structure is extremely high overall, with a strong level of data consciousness triggered by the need for exactitude across processes. The percentage of employees that interact with data daily (be it to take a decision, model risks, or assess feasibility) is among the highest. And with strong cultures of system builders and model governance, banks are particularly well positioned to put the right framework in place to properly tackle the related risks. 

Overall, all banks have a high level of readiness for their data journey. Some are more advanced than others, for sure, and some are more willing to dare to take action. The main advice is that waiting for full readiness is not a winning strategy: Data readiness stems from action, with initiatives shaping processes and risk acceptance, and acting as the only viable catalyst for building data culture. Those who put too many gates around their data efforts risk becoming laggards, not to mention the connected risk of losing the talented individuals willing to make this transformation happen.

2. Balance Is at the Heart of the AI Race:

Managing investments and banking activities is all about taking calculated risks. With such a strong risk management DNA and model risk management imperatives, it is only natural for banks to have extremely high consciousness on risks linked to data, analytics, and AI. 

Seeking perfect data readiness prior to leveraging data and developing AI is not a winning strategy. The financial institutions reaping benefits from AI are the ones that dare to take action, in a controlled manner, with the right type of focus and safety nets. And this starts by giving people access to data as an essential step toward ownership, data consciousness, and action. With the right framework and IT setup, banks can equip broad communities with the capacity to test, learn, develop, and turn data into meaningful outputs — whether to better know their customers, manage risks, or improve operational process resilience — without jeopardizing their governance objectives.

Anti-Money Laundering (AML) is a domain where there is both consensus on the potential of AI and deep fear of the unwanted effects of ill-managed models. All banks today are equipped with rules-based engines. More often than not, these trigger large volumes of false positives and offer limited capacity to identify new risks early, leading to high alert-management costs and ineffective setups. Striking a balance between keeping existing systems live and gradually enhancing them (first with alert tiering, then with agile threshold testing and side-by-side comparison of rule-based and machine learning alerts) is critical to delivering successful transformation in this domain, with the right trust from all stakeholders, including market regulators.
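To make the alert-tiering idea concrete, here is a minimal, purely illustrative sketch: a legacy fixed-threshold rule keeps generating alerts as before, while a toy risk score (the features, weights, and tier cutoffs are all hypothetical assumptions, not a production model) ranks those alerts so analysts review the riskiest first.

```python
# Hypothetical sketch: rules engine kept live, with ML-style tiering on top.
# All thresholds, features, and weights below are illustrative assumptions.

RULE_THRESHOLD = 10_000  # classic fixed-amount rule

def rule_alert(txn):
    """Legacy rules engine: flag any transaction above a fixed amount."""
    return txn["amount"] > RULE_THRESHOLD

def risk_score(txn):
    """Toy score combining amount, geography, and account age (0.0-1.0)."""
    score = min(txn["amount"] / 50_000, 1.0) * 0.5
    score += 0.3 if txn["high_risk_country"] else 0.0
    score += 0.2 if txn["account_age_days"] < 90 else 0.0
    return score

def tier(txn):
    """Tier rule-generated alerts so analysts review the riskiest first."""
    if not rule_alert(txn):
        return "no_alert"
    s = risk_score(txn)
    if s >= 0.7:
        return "tier_1_review_now"
    if s >= 0.4:
        return "tier_2_review_soon"
    return "tier_3_likely_false_positive"

transactions = [
    {"amount": 45_000, "high_risk_country": True,  "account_age_days": 30},
    {"amount": 12_000, "high_risk_country": False, "account_age_days": 2_000},
    {"amount": 5_000,  "high_risk_country": True,  "account_age_days": 10},
]

for t in transactions:
    print(tier(t))  # tier_1_review_now, tier_3_likely_false_positive, no_alert
```

The design point is the gradual path the article describes: the rules engine stays authoritative (no alert is suppressed), and the score only reorders the analyst queue, which lets the two approaches be compared safely before any rule is retired.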

How to approach AML successfully with AI is just one example of why putting data to work is first and foremost a question of balance: Where big bang approaches fail, step-by-step, agile initiatives owned by empowered teams are the key to successfully turning AI into a driving transformation force.

3. As With All Systemic Changes, Embedding Is the Core Target: 

New technologies can live on for a long time as proofs of concept. But full impact only happens when the underlying processes (including systems, people, and organization) themselves change.

When looking at the opportunities linked to data, the magnitude of impact can be overlooked if proper embedding in day-to-day processes is not taken into account. In this domain, the ultimate objective should be to equip current process owners, supported by business-focused data teams as needed, with the right means to drive their own process transformation. 

Such approaches can take many different forms. First, it is about ensuring that these teams have the capacity to understand their own activities, with the required industrialized KPIs, and to measure trends over time. The next step soon becomes mining processes to uncover inefficiencies or conformance issues, and performing root cause analysis prior to embracing the potential of predictive insights. Overall, these data techniques, from basic to advanced, can only deliver their full potential if embraced by process owners themselves and embedded in day-to-day activities and systems. Removing silos between AI initiatives and business programs, with an urgency for short-term, tangible impact, is among the keys to success.
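The progression from industrialized KPIs to process mining can be sketched in a few lines. The example below is a toy illustration (the event names, expected path, and timestamps are invented assumptions): it derives two basic process KPIs from an event log, cycle time per case and conformance to an expected path, which is the kind of measurement that precedes root cause analysis.

```python
# Illustrative sketch of basic process-mining KPIs on a loan event log.
# Step names, the "expected path," and the data are hypothetical examples.
from collections import defaultdict
from datetime import datetime

EXPECTED_PATH = ["received", "kyc_check", "approved", "funded"]

events = [
    ("case_1", "received",  "2024-01-02T09:00"),
    ("case_1", "kyc_check", "2024-01-02T11:00"),
    ("case_1", "approved",  "2024-01-03T09:00"),
    ("case_1", "funded",    "2024-01-04T09:00"),
    ("case_2", "received",  "2024-01-02T10:00"),
    ("case_2", "approved",  "2024-01-02T12:00"),  # skipped kyc_check
    ("case_2", "funded",    "2024-01-05T10:00"),
]

# Group events by case, ordered by timestamp.
cases = defaultdict(list)
for case_id, step, ts in events:
    cases[case_id].append((datetime.fromisoformat(ts), step))

for case_id, steps in sorted(cases.items()):
    steps.sort()
    path = [step for _, step in steps]
    cycle_time_h = (steps[-1][0] - steps[0][0]).total_seconds() / 3600
    status = "ok" if path == EXPECTED_PATH else "deviation"
    print(case_id, f"{cycle_time_h:.0f}h", status)
```

Even this trivial conformance check surfaces the kind of finding process owners act on (here, a case that skipped its KYC step), before any predictive modeling comes into play.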

While historically AI investments have been geared towards risk management and customer growth, there is a growing trend of directing some of these efforts towards operational resilience objectives. Reducing operational costs while improving efficiency is a continuous effort for banks, and much has been done in the past, be it with RPA, Six Sigma, offshoring, or other methodologies. Accelerating the everyday use of data and AI, across processes and systems, with teams empowered to own their change agenda, opens a significant path to renewed transformation opportunities in their organizations.

4. From Generative AI as an Objective to a Data-Transformation Catalyst: 

Generative AI is at the forefront of the recent debate, and rightly so: It opens tremendous potential for financial institutions that are, by nature, high content producers. Once the related risks are managed, all banks have similar objectives in line of sight: reducing operational burden (and costs), unlocking new scalability in their operations, and more.

For some financial players that have been slower to embrace the data journey, Generative AI can seem a softer path forward. Less dependent on data, it appears to offer simpler promises: augmenting customized customer interactions, simplifying knowledge querying, and much more.

When it comes to comprehensive process augmentation, it’s rare to be able to rely on single techniques. If we take the example of climate change management and the need for banks to fully own and manage the carbon exposure of their credit portfolios, Generative AI will play a key role in simplifying conversion of documents into insights, but more will need to come into play. Solving this challenge will be about combining the tremendous potential of Generative AI with other more traditional data approaches, and blending all towards the unique solutions each financial institution will need. 

Overall, be it for climate change management, risk mitigation, or simply operational burden reduction, the potential of Generative AI is extensive. The advice is to embrace its potential as a flagship to accelerate the comprehensive data and AI strategies banks seek. 

Capture the Data and AI Potential in 2024 and Beyond

Thanks to more than 150 customers in the financial sector, Dataiku benefits from a unique point of view on how banks and other financial institutions embrace the potential of data and AI. All share the same fundamental conviction: They bet on what makes them unique — their employees — empowering them with means to act, while consciously building the appropriate governance to support a trusted acceleration curve. With Generative AI as a new catalyst, we can expect the move to data and AI to further accelerate, thus becoming one of the main new competitiveness factors across institutions in the next few years. 
