Who Should Deploy My Data Science Models?

Scaling AI | Ankit Singi

Deploying a new data science model in production is like installing a new light bulb in your house.

If you want to be independent and make it a DIY project, you will need a ladder, a socket, wiring in place, and a new bulb. You can then insert the bulb firmly into the socket, turn it clockwise to make sure it fits properly, and flip the power switch. It’s easy to do, and you can take care of future replacements as well.

But what if you don’t have the socket installed? What if the wiring between the socket and the switch is broken? What if you don’t have a ladder to climb up and insert the bulb? You will need an electrician to do the installation and help you out.


If we compare the above scenario to the model deployment process, the socket is the underlying infrastructure, the wiring is the data flows and pipelines, the ladder is the data science platform, and the new bulb is the model you want to deploy. You (the data scientist) could do it yourself or depend on the IT team (the electrician) to help you out.

The process of deploying and maintaining machine learning (ML) models in production is called MLOps. As MLOps has gained prominence in recent years, plenty of material is available to explain the process and the steps associated with model deployment. You can look at it through different lenses, but, at a high level, an MLOps lifecycle boils down to the steps below.

1. Design Phase

This is where you identify the business problem, gather requirements from users, and pin down the data sources required to build the model. Once that’s done, you design your data pipelines, engineer features, and then build, iterate on, and test your data science models.
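
To make that inner loop concrete, here is a minimal sketch in Python: assemble a feature table, train a candidate model, and check how it performs before iterating. The churn-style dataset, feature names, and model choice below are made up purely for illustration.

```python
# Minimal design-phase sketch: engineer features, train a candidate model,
# and evaluate it. All data and names here are synthetic and illustrative.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "visits": rng.integers(1, 50, size=1000),
    "total_spend": rng.gamma(2.0, 100.0, size=1000),
})
df["spend_per_visit"] = df["total_spend"] / df["visits"]  # engineered feature
df["churned"] = (rng.random(1000) < 1 / (1 + df["spend_per_visit"] / 50)).astype(int)

X = df[["visits", "total_spend", "spend_per_visit"]]
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("Candidate AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```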

2. Industrialization

Once you are satisfied with the model’s performance, you need to add checks and safeguards across your data pipelines and define model validation parameters to make the deployment more robust and performant.
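
As an illustration of what those checks might look like in code, here is a small sketch of two pre-deployment gates: a schema check on the feature pipeline’s output and a performance bar the candidate model has to clear. The column names and the 0.75 AUC threshold are assumptions, not a standard.

```python
# Illustrative pre-deployment gates: fail fast on a broken feature pipeline
# and block deployment if the candidate misses an agreed performance bar.
from sklearn.metrics import roc_auc_score

REQUIRED_COLUMNS = {"visits", "total_spend", "spend_per_visit"}  # illustrative schema
MIN_AUC = 0.75                                                   # illustrative bar

def validate_pipeline_output(df):
    """Raise if the feature pipeline produced an unexpected shape."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Pipeline output missing columns: {missing}")
    if df[list(REQUIRED_COLUMNS)].isnull().any().any():
        raise ValueError("Pipeline output contains null feature values")

def validate_model(model, X_test, y_test):
    """Only allow deployment when the candidate clears the performance bar."""
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    if auc < MIN_AUC:
        raise ValueError(f"Candidate AUC {auc:.3f} is below threshold {MIN_AUC}")
    return auc
```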

3. Deployment

Up next is the deployment phase, where you conduct User Acceptance Testing (UAT) from both a business and an IT perspective. Then, you deploy the project across multiple environments and ensure the newly deployed models don’t disrupt downstream business applications.
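
One lightweight way to guard those downstream applications is a smoke test that scores the same sample with the current production model and the candidate, and flags large shifts before promotion. The sketch below assumes both models are stored as joblib artifacts; the 0.05 tolerance is an arbitrary illustration.

```python
# Illustrative deployment smoke test: compare candidate vs. production scores
# on a sample and refuse promotion if predictions shift too much.
import joblib
import numpy as np

def smoke_test(candidate_path, production_path, X_sample, max_shift=0.05):
    """Return True if the candidate's scores stay close to production's."""
    candidate = joblib.load(candidate_path)
    production = joblib.load(production_path)
    cand_scores = candidate.predict_proba(X_sample)[:, 1]
    prod_scores = production.predict_proba(X_sample)[:, 1]
    mean_shift = float(np.mean(np.abs(cand_scores - prod_scores)))
    return mean_shift <= max_shift
```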

4. Monitoring

Once models are deployed, you need to enable monitoring from both a technical (infrastructure, outages, high availability, etc.) and a functional (model drift, model performance metrics such as AUC and precision, etc.) perspective. Sometimes you might need to fix job errors or even roll back to a stable model version.
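
As a rough sketch of the functional side, the snippet below compares live score distributions to a training baseline with a population stability index (PSI) and computes AUC once ground-truth labels arrive. The 0.2 PSI alert threshold is a common rule of thumb, not a requirement.

```python
# Illustrative functional monitoring: drift check via PSI plus AUC tracking.
import numpy as np
from sklearn.metrics import roc_auc_score

def psi(expected, actual, bins=10):
    """Population stability index between baseline and live samples."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected) + 1e-6
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual) + 1e-6
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

def monitoring_report(baseline_scores, live_scores, y_true=None, live_proba=None):
    """Summarize drift and, when labels are available, live model performance."""
    report = {"score_psi": psi(baseline_scores, live_scores)}
    report["drift_alert"] = report["score_psi"] > 0.2  # rule-of-thumb threshold
    if y_true is not None and live_proba is not None:
        report["auc"] = roc_auc_score(y_true, live_proba)
    return report
```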

5. Model Management

Once your initial models are operational, you need to plan for iteration and continuous delivery of ML models in production. When a model’s performance degrades or the model expires, retraining and selection of an updated model needs to be triggered through the CI/CD process.
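
A scheduled CI/CD job can reduce that decision to a simple check, along the lines of the sketch below: retrain when monitored performance degrades or when the deployed model passes an agreed shelf life. The 0.75 AUC floor and 90-day expiry are assumptions for illustration.

```python
# Illustrative retraining trigger for a scheduled CI/CD job.
from datetime import datetime, timedelta

MIN_AUC = 0.75                       # illustrative performance floor
MAX_MODEL_AGE = timedelta(days=90)   # illustrative model shelf life

def should_retrain(latest_auc, deployed_at, now=None):
    """Trigger retraining on performance degradation or model expiry."""
    now = now or datetime.now()
    degraded = latest_auc < MIN_AUC
    expired = (now - deployed_at) > MAX_MODEL_AGE
    return degraded or expired

# Example: a nightly job calls should_retrain() and, if it returns True,
# kicks off the training pipeline and opens a promotion request.
```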

What Is Loosely Defined Is Who Will Own the Deployment and Monitoring Steps

Data scientists know the ins and outs of their models but, generally, they consider their job done once the model is designed and depend on other teams to handle deployment and monitoring. IT teams are typically best equipped to handle infrastructure issues and resolve production deployment problems. However, functional monitoring of models in production, such as tracking AUC or defining a model retraining strategy, requires ML expertise that IT teams may not have.

Also, if you are looking for more flexibility, it's better to install and monitor your own bulb (to go back to our earlier analogy). Hiring an ML engineer to take care of all your model deployment issues might not solve every problem either. An ML engineer might be well versed in the whole DevOps / MLOps process but have very limited knowledge of what is going on inside the model and how the underlying infrastructure needs to be tuned.

Understand the Criticality Associated With Each Data Science Project

  • “Mission-critical models”: These models serve your customer base, and an error can translate directly into lost revenue.
  • “Internal business models”: Mostly used by internal teams, where the speed of model deployment matters more than robustness.


The traditional route is to involve your IT team from the start and let them take care of all the model deployment and monitoring processes. They will follow the established guidelines and ensure mission-critical systems adhere to a proper framework. This is the recommended approach when the stakes are high, because even a small error can cost real dollars and erode the business’s trust. But it is not a very flexible route and, if you are looking to get hundreds of models into production every month, depending on IT might not work out so well.

For more flexibility in deploying models to production, data scientists should take end-to-end responsibility for designing, deploying, and monitoring their models in production. This ensures autonomy across the model lifecycle and speeds up the process, as it removes the bottleneck of handing over monitoring and version rollbacks to the IT team.

But handling mission-critical systems is not a data scientist’s core area of expertise, and if an infrastructure component goes down, eyebrows will be raised. A more sustainable approach is a collaborative one, where data scientists deploy the first iteration of their model in production and then transfer ownership of maintenance activities, such as model monitoring, lifecycle management, and governance, to the IT team. The key is to ensure proper communication between data and IT teams and to assign clear roles and responsibilities in order to achieve a seamless transition.
