Surmounting the Toughest Analytics Hurdles for Financial Institutions

Scaling AI | Sophie Dionnet

Financial institutions are data-driven by nature. All their core processes — including customer suitability assessments, credit allocation decisions, and liquidity buffer management — depend on data accessibility and AI models to make the best risk-adjusted business decisions.

This data intimacy should give financial services institutions a head start toward fully embedding analytics and AI, but significant hurdles remain.


Let’s Start With Access to Data

Data is the bread and butter of financial institutions, and their data structures reflect the historical composition of business models and underlying information systems. As a result, data is often organized by product and activity.

Histories of mergers and acquisitions have also shaped the foundations of information systems, creating legacy burdens and barriers to easy data access. This, combined with the heavy regulatory administration surrounding data access, is the first significant barrier to analytics development.

A first step is for organizations to recognize that scaling analytics starts with broadening access to data. Banks and insurance companies are rightfully reluctant to do so for a variety of reasons, including regulation (e.g., GDPR), the absence of central warehouses, perceived risks to infrastructure resilience, and more.

However, broadening access remains a critical step in delivering the right acceleration. It can be done by combining the agility of open experimentation spaces with strong governance gates for criticality assessment and the move to production.

One of the main benefits of this strategy is that it lets data management teams move to an evidence-based approach. Why? Because every business will claim that its data is of the utmost importance, in the same way every team will ask for its tools to have P1 status on recovery plans.

Leveraging evidence of tangible usage is a powerful way to cut down administrative discussions and painful qualification efforts. This, of course, needs to be done with the right type of technology structure to lower all related risks — from data access (for example, using sample data) through security and access control.
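As a rough illustration of the sample-data idea, here is a minimal Python sketch of exposing a small, de-identified extract to an experimentation space instead of the full production table. The column names (customer_id, account_number) and the 5% sampling rate are hypothetical assumptions, not a prescribed implementation.

```python
# A minimal sketch, assuming a pandas DataFrame of customer records.
# Column names and the sampling rate are hypothetical.
import pandas as pd

def build_sandbox_extract(customers: pd.DataFrame, sample_frac: float = 0.05) -> pd.DataFrame:
    # Draw a reproducible random sample so experiments stay lightweight.
    extract = customers.sample(frac=sample_frac, random_state=42)
    # Drop direct identifiers before the data leaves the governed zone.
    return extract.drop(columns=["customer_id", "account_number"], errors="ignore")
```

The point of the sketch is the governance posture, not the code itself: experimentation happens on a reduced, de-identified slice, while the move to production goes through the stronger gates described above.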

The Importance of Upskilling

Another main barrier to analytics development is linked to upskilling and trust. Among modeling experts, the average data literacy level in financial services institutions is high. For everyone else, however, the move to business-embedded analytics requires a shift in mindset, as well as possible technology upskilling and change management.

A good example is best practice for dealing with missing data. In some domains, using proxies is perfectly acceptable; in others, it is poor practice. If prices and characteristics are not available for all the instruments traded on a particular day, there are times when making an estimate makes a lot of sense (for example, to estimate margin calls and risks).

However, in some cases, “guessing” missing data can have a significant impact on decision making. Companies willing to embrace analytics must invest in upskilling their employees and build a collaboration environment that organizes exchanges and controls between risk experts, business professionals, and data scientists, so that initiatives remain well controlled.
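To make the distinction concrete, here is a minimal, hypothetical Python sketch of the two postures: filling a gap with a proxy (carrying forward the last observed price for a rough margin estimate) versus flagging the gap so it is reviewed before it drives a decision. The field names (instrument_id, price, date) are assumptions for illustration only.

```python
# A minimal sketch, assuming a pandas DataFrame of end-of-day instrument prices
# with hypothetical columns "instrument_id", "price", and "date".
import pandas as pd

def estimate_with_proxy(prices: pd.DataFrame) -> pd.DataFrame:
    # Acceptable in some contexts: carry forward the last observed price per
    # instrument to get a rough margin-call or risk estimate.
    out = prices.sort_values("date").copy()
    out["price"] = out.groupby("instrument_id")["price"].ffill()
    return out

def flag_missing_for_review(prices: pd.DataFrame) -> pd.DataFrame:
    # Safer when the value feeds a decision: keep the gap visible and mark the
    # row so it is reviewed rather than silently "guessed".
    out = prices.copy()
    out["needs_review"] = out["price"].isna()
    return out
```

Which posture is right depends on the decision the data feeds — exactly the kind of judgment call that collaboration between risk experts, business professionals, and data scientists is meant to settle.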

How Is the Financial Services Sector Achieving Success With AI? 

The first to embrace the AI journey were investment teams, who — in their constant search for unique market insights and investment models — saw in AI a unique opportunity to innovate. While this has been very successful for a few, it has also led to many unfruitful initiatives and, to a certain extent, to the misconception that AI is only about innovation and cracking highly advanced market topics.

The financial companies that have been most successful with AI are those that focus their AI initiatives on “day one” topics such as operational process optimization, customer analytics and customer journey enhancement, risk management across all dimensions, and more.

After more than 10 years of deep regulatory transformation, all financial players have significantly enhanced their risk frameworks, but much remains to be done across all dimensions. The successful integration of AI in risk management has played an essential role in reinforcing the robustness of the banking system — more agile and impactful investigations, new internal controls, and analytics-driven financial crime monitoring, to name a few examples.

AI is also driving a real revolution in risk assessment, notably through the enhanced use of alternative data. This is true for both traditional risks and emerging risks such as climate change, helping all financial players — banks and insurers alike — reconsider how they price risk. Those who developed strong expertise in leveraging alternative data and agile modeling have been able to truly benefit from that investment during the ongoing health crisis, which deeply challenged traditional models (notably corporate scoring).

Lastly, the positive impact of AI on customers should not be underestimated. Financial services institutions face an aggressive competitive landscape and growing customer demand for personalization, both of which are driving improved customer orientation. The capacity to build complete customer views and to optimize customer journeys, notably in claims management, are two examples of areas where AI has supported deep transformation within banks and insurance companies, and there are plenty more opportunities waiting to be explored.

Overall, analytics and AI remain a significant, largely untapped opportunity for most. The fact that AI and analytics are more frequently leaving data labs to be fully embedded in business lines shows the momentum is there. However, there remains much to do, and there is a race between players to see who will seize the full potential first. My bet is on those who decide to overcome the tangible and perceived barriers to data access, with a combined emphasis on governance and a decisive focus on systematic process enhancement.
