Why Analytics & AI Should Be Core to Your P&L Optimization Strategy

Use Cases & Projects | Scaling AI | Sophie Dionnet

Despite the economic potential from AI and analytics touted by consulting firms and even the media (in life sciences alone, per McKinsey, AI is estimated to have the potential to deliver up to $441 billion of value), let’s face it: most organizations aren’t there yet when it comes to getting massive, quantifiable value from AI.

→ Get the Ebook: Transform AI From a Cost to Revenue Center

In times of economic stress marked by ongoing inflation, geopolitical uncertainty, and an unprecedented shift in consumer patterns, organizations understandably turn inward, focusing only on those core programs or initiatives that will see the company through to the other side. For many businesses, this means AI and analytics initiatives are on the chopping block. 

Here’s why this approach is shortsighted, particularly for data-native functions such as finance, supply chain, and marketing, teams that are well positioned to leverage AI for greater efficiency and real cost gains.

Impacting P&L With AI

Converting macro trends into decision making means finding a way to analyze sub-contributions and drivers within one’s business model. When it comes to the potential of AI, there are three main sources of P&L impact. 

The first is analytics and AI projects that are forward-looking revenue generators. By better understanding consumers, delivering better products and services, improving market understanding, or by innovating with new value propositions, AI has strong potential to act on revenue increase. 

Fueled by refined usage of data, these projects trigger forward-looking strategies by both optimizing current business initiatives and supporting the development of ancillary business models. Initiatives such as omnichannel marketing or next-best offers fall into this category (for a more concrete example, look at Showroomprivé, who leveraged Dataiku for machine learning-based targeting to build marketing campaigns that are 2.5x more effective).

The second is cost avoiders. Be it by lowering the cost of risks, limiting the use of external services, or reducing exposure to incidents and hazards, AI and analytics are core to avoiding unwanted spending. Common examples are quality control, improving pharmacovigilance, enhancing anti-money laundering detection, better managing brand reputation watch, and more. 

These first two types of analytics and AI projects are about prospective P&L impact — in other words, AI or analytics systems that could bring ancillary revenues (or avoid cost increase) if they are successful. However, as we saw in the introduction, even though bringing this type of value in the short term isn’t a guarantee, that doesn’t mean it’s not worth investing — there are ways to ensure building these types of use cases isn’t prohibitively costly, as we’ll discuss later.

That’s why the third and most important driver in times of economic uncertainty is cost reducers. That is, analytics initiatives can actually help optimize P&L by acting on essential cost drivers such as full-time employee (FTE) count, IT spending, the use of external vendors, and more. Typically, this is achieved by ensuring that a given business process, such as fraud analysis, that used to take 50 FTEs can now be performed by 30 FTEs, thanks to advanced analytics that refine alerting and enable graph-powered investigations. 
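To make the "refined alerting" idea concrete, here is a minimal sketch of graph-powered alert triage: score each open alert by how close its account sits, in a transaction graph, to an already-confirmed fraud case, so analysts work the riskiest alerts first. All account names, the graph, and the scoring function are illustrative assumptions, not anything from a real fraud system.

```python
from collections import deque

# Toy transaction graph: each account maps to the accounts it transacted with.
# Entirely hypothetical data for illustration.
graph = {
    "acct_a": ["acct_b"],
    "acct_b": ["acct_a", "acct_c"],
    "acct_c": ["acct_b", "fraud_1"],
    "fraud_1": ["acct_c"],
    "acct_d": ["acct_e"],
    "acct_e": ["acct_d"],
}
known_fraud = {"fraud_1"}  # accounts already confirmed as fraudulent

def hops_to_fraud(start):
    """Breadth-first search: hops to the nearest confirmed fraud node, or None."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node in known_fraud:
            return dist
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # not connected to any known fraud

def priority(account):
    """Higher score = fewer hops from a confirmed fraud case."""
    d = hops_to_fraud(account)
    return 0.0 if d is None else 1.0 / (1 + d)

# Rank the open alert queue by graph proximity to known fraud.
alerts = ["acct_d", "acct_a"]
ranked = sorted(alerts, key=priority, reverse=True)
```

Here `acct_a` is three hops from `fraud_1` and jumps to the top of the queue, while `acct_d`, in a disconnected component, drops to the bottom. Real investigations layer many more signals on top, but the basic mechanism (ranking work by graph distance to confirmed cases) is what lets fewer analysts cover the same alert volume.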

Deep-Dive on Cost Reduction

In times of P&L stress, CFOs are focused on making sure results from company initiatives are both material and timely. That doesn’t necessarily mean cost reduction as the new and only absolute, but it does move it to the forefront of corporate priorities, both as a way to preserve competitiveness and to ensure sufficient means to fund targeted revenue growth initiatives. 

AI can deliver significant acceleration on cost reduction initiatives in three ways: 

  1. Speeding up analyses and detecting inefficiencies. Anyone who has worked on delivering operational efficiencies knows that formulating a recommendation starts with having the right insights — and here, analytics play a critical role. Having agile access to data to understand past track records, using modern techniques (empowered by machine learning) to best identify critical factors, or leveraging process mining to have a data-driven approach to process understanding are essential first steps to assessing and taking action. 

  2. Streamlining data processing in business processes. As the need to monitor and report on company activities across multiple dimensions has grown, so has the development of complex processes heavily relying on data ingestion and processing. More often than not, the gap that must be bridged between systems and the level of human intervention and tedious, manual work required remains significant, with much opportunity to rationalize and reduce overheads.

    Finance teams are the perfect storm for such opportunities. As they aggregate data to understand costs and revenues as well as forecast landing and simulate budgets, finance teams own many manual processes that are not only costly but also prevent them from fully developing their business partnering potential. The good news is that empowering these teams with analytics can be a game changer. As an illustration, Standard Chartered Bank revamped its financial forecasting processes through enhanced analytics processing with Dataiku, enabling two FTEs to pick up work previously performed by 70.

  3. Enhancing processes with AI-powered approaches. On top of simply revisiting how data-heavy processes happen, process efficiency can also come from applying end-to-end data engineering combined with advanced data science techniques. Dataiku customer Malakoff Humanis, for example, reduced the daily time taken to process its claims from one full day to one hour by combining NLP and deep learning to accelerate claims routing.

    Predictive maintenance is another example of AI applied to streamline cost structures. For example, at Oshkosh Corporation, engineering teams use Dataiku to create predictive maintenance analytics that save their customers time and money by keeping vehicles on the job and out of the shop. Early pilot program results show a potential 2-3x improvement in required maintenance intervals, which translates to more than a 50% reduction in downtime and support costs over the life of the vehicle. 
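The downtime arithmetic behind numbers like these is worth spelling out: longer maintenance intervals mean fewer service events per year, and each avoided event removes a fixed block of downtime. The sketch below works through that logic with purely illustrative figures; none of the numbers are Oshkosh's.

```python
# Back-of-the-envelope downtime arithmetic for longer maintenance intervals.
# All figures are illustrative assumptions, not real fleet data.
baseline_interval_hours = 500   # assumed: service every 500 operating hours
improvement_factor = 2.5        # within the 2-3x interval improvement cited
downtime_per_service_hours = 8  # assumed downtime per maintenance event

def annual_downtime(interval_hours, operating_hours_per_year=2000):
    """Downtime per year = number of service events x downtime per event."""
    services_per_year = operating_hours_per_year / interval_hours
    return services_per_year * downtime_per_service_hours

before = annual_downtime(baseline_interval_hours)
after = annual_downtime(baseline_interval_hours * improvement_factor)
reduction = 1 - after / before  # fraction of annual downtime eliminated
```

Under these assumptions, a 2.5x interval improvement cuts annual downtime by 60%, consistent in shape with the "more than 50% reduction" figure above; the point is that the downtime saving scales directly with the interval multiplier.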

The Cost of Cost Reduction

While the potential to leverage AI for cost reduction is appealing, it’s naïve to ignore the cost of the AI initiatives themselves. In order to generate short-term, material impacts on one’s P&L, is there a way to lower overall analytics and AI project costs, making sure that these benefits are delivered more efficiently? Put more bluntly, in a given year, how can you be sure cost reduction benefits outweigh investments? 

Of course, there’s no easy answer — it always depends on the organizational context, including the projects, nature of the expected output, capacity to materialize the savings themselves, etc. But there are ways to lower the investment cost and reinforce the likelihood of positive outcomes.  
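One way to ground that "benefits vs. investments" question is a simple first-year break-even check: annualized FTE savings, discounted for ramp-up time, against platform and build costs. Every figure below is a hypothetical assumption for illustration, not a Dataiku price or benchmark.

```python
# Hedged sketch: one-year break-even check for a cost-reduction use case.
# All inputs are illustrative assumptions.
fte_annual_cost = 100_000    # assumed fully loaded annual cost per FTE
ftes_freed = 20              # e.g., a process going from 50 to 30 FTEs
ramp_up_months = 4           # savings only materialize after go-live

platform_annual_cost = 250_000  # assumed platform + infrastructure spend
one_off_build_cost = 150_000    # assumed one-time project build effort

annual_savings = fte_annual_cost * ftes_freed
# Only (12 - ramp_up_months) months of savings land in year one.
realized_savings_year_1 = annual_savings * (12 - ramp_up_months) / 12
net_year_1 = realized_savings_year_1 - platform_annual_cost - one_off_build_cost
```

With these assumptions the project clears break-even comfortably in year one (roughly $1.33M of realized savings against $400K of cost), and the same three levers the article goes on to discuss (lower build cost, faster ramp-up, reuse across projects) are exactly the inputs that move this calculation in your favor.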

[Infographic: the traditional data science project lifecycle]

The traditional data project lifecycle generally begins with an analysis phase, followed by planning, execution by IT teams, validation from business owners, and push to production (all while maintaining a clear vision of the project’s goals and key success metrics throughout). The systematic reliance on IT, however, from the need to create analytics environments to specifications and execution, can be a massive roadblock and decelerator.

This is where platform-powered agile analytics and AI come in. As process owners, business professionals empowered with the right tools can themselves become change agents, lowering both time and cost to impact. Such initiatives of course still require support to be successful, both in the form of robust foundations from IT and of data experts who guarantee resilience and scalability. However, a platform approach more easily allows masses of knowledge workers to tackle more advanced analytics matters quickly, maximizing scalability and reuse of datasets, artifacts, or even entire projects.  

All industries and business functions can benefit from this approach:

  • In banks and insurance companies, which are data native by essence and can see significant benefits from improving all their data-driven processes. 
  • Across finance functions that are in charge of processing the right data to better understand business performance, support decision making, and deliver optimized P&L. 
  • In customer analytics teams, which continuously process data yet often lack efficiency in doing so. 

Beyond FTE-related efficiencies, such projects also give an opportunity to challenge processes and underlying IT tools, opening the path to additional opportunities for massive, systemic cost savings. 

How Dataiku Fits In

At Dataiku, our customers successfully accelerate on these types of initiatives by leveraging the platform’s no-code capabilities to empower business users in reinventing their ways of working and in boosting their productivity. Plus, streamlined collaboration between data scientists and business professionals paves the way to revamped and optimized processes using the full potential of AI.

In addition, Dataiku’s data ingestion and analytics capabilities concretely lower the cost of compliance with critical new regulations such as IFRS. Customers re-internalize analytics processes that they used to delegate by leveraging pre-built solutions. And by doing so, they deliver tangible benefits to support their company’s P&L optimization efforts — and more.
