Robotic Process Automation (RPA) and Artificial Intelligence (AI) are two of the major growing trends in technology, pushing the boundaries of how companies and businesses can transform their operations in an increasingly digital world. In our previous blog post, we outlined the key differences between RPA and AI, as embracing either technology can help organizations improve performance and reduce costs in different ways.
RPA has core strengths in automating mundane or repetitive tasks, which may either be the same for each iteration or include some additional complexity such as rule-based logic. AI technologies, meanwhile, can be embraced by various teams for complex analysis of historical data to predict or optimize future outcomes. Data scientists and citizen data scientists can work together with the business to ensure that any AI outputs are in line with business expectations.
Since these technologies can coexist, what are the scenarios where both can be combined? Your organization may already be at the stage where you have both, but you’re wondering where the overlaps and integrations are — how do you get the best from both technologies?
A Faster Route to Automated Intelligence
Some RPA platforms already include built-in intelligence, which is useful for scraping information from images or documents (i.e., performing relatively complex cognitive tasks). But there are also opportunities for each technology to feed into the other, such as including an AI prediction in a wider RPA process that touches various business departments, or integrating an RPA process into an automated machine learning (ML) flow so that another RPA process can be triggered depending on the outputs of an ML model in production.
The benefit of combining both technologies is that wider business processes can be orchestrated while remaining cognizant of any known (or unknown) business impact. Businesses can use AI to gain a holistic view of and insights from data and can then use patterns within that data to arrive at faster and more informed decisions about the business — all while embedding that into existing, accelerated processes enabled by RPA.
In this blog post, we will walk you through a project where information is shared between both the RPA process created in UiPath and an AI workflow and API endpoints created in Dataiku. You’ll see how to create and leverage the integrations between the two platforms in a way that gives users the choice over how the integration is leveraged. The UiPath process can be triggered from within an automated Dataiku workflow, and the next steps for each process repetition depend on an AI prediction, bringing a true partnership of automation and predictive power.
It Takes Two
We can combine these two different platforms by using Dataiku plugins and UiPath packages, which enable connectivity between them if you are already set up and working in both.
Although Dataiku is a very comprehensive platform on its own, it is extensible via Dataiku plugins. These offer customers the flexibility and capability to accommodate a wider range of use cases with speed. One of these plugins is our UiPath Orchestrator plugin, which lets the user include an automation scenario step to kickstart an existing process on a UiPath robot. From within Dataiku, we can install the plugin from the plugin store and create a preset using the UiPath user credentials and Orchestrator API details. Then, when building a scenario, we have the option to add a “UiPath Orchestrator - UiPath Job” step to specify the robot and process details to be triggered, which can be included in our automatic scenario.
Similarly, modularity within UiPath is achieved through the use of packages, which allow users to add specific activities and tasks in a project. Here, we’ll be using the Dataiku - DSS Activities package within UiPath, which allows the robot to query and retrieve an output from an existing Dataiku API node. Once the package is installed (from Manage Packages in UiPath Studio), you will be able to add the relevant “Query DSS API Node” activity, including the relevant Dataiku API endpoint address, API Key, and query as a JSON string. There is also the option to set a timeout threshold and define the variable name of any output received from the API Node.
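Outside of UiPath, the same kind of call the “Query DSS API Node” activity makes can be sketched in plain Python. This is a minimal illustration, not the package’s internals: the host, service ID, endpoint ID, and API key are placeholders, and the URL path and Basic-auth scheme follow common Dataiku API node conventions, so confirm them against your own deployment.

```python
import base64
import json
import urllib.request

# Hypothetical placeholders; substitute the values from your own
# Dataiku API node deployment.
API_NODE = "https://api-node.example.com:12000"
SERVICE_ID = "clv_service"
ENDPOINT_ID = "Random_Forest_pred"
API_KEY = "YOUR_API_KEY"

def build_predict_request(features):
    """Build the URL and JSON body for a prediction query."""
    url = f"{API_NODE}/public/api/v1/{SERVICE_ID}/{ENDPOINT_ID}/predict"
    body = json.dumps({"features": features}).encode("utf-8")
    return url, body

def query_endpoint(features, timeout=30):
    """POST the features to the API node and return the parsed reply."""
    url, body = build_predict_request(features)
    # API keys are commonly sent as the Basic-auth username with an
    # empty password; adjust if your deployment authenticates differently.
    auth = base64.b64encode(f"{API_KEY}:".encode()).decode()
    request = urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {auth}"},
    )
    with urllib.request.urlopen(request, timeout=timeout) as response:
        return json.loads(response.read())
```

The timeout and output-variable options of the UiPath activity map directly onto the `timeout` argument and the returned dictionary here.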
Let's Put Things Into Practice
The premise for this walk-through takes the perspective of a retail organization whose data science team has used Dataiku to create an existing customer lifetime value predictive model, which is a sample project you can create yourself in Dataiku. In this instance, the data science team has trained some predictive models against historical consumer spending data. The team has used the customers’ past spending data, web visits, location, and whether they were part of a marketing campaign to help the sales team more easily predict how valuable a customer will be in the future (a useful sales activity). These models will be exposed as API endpoints on a single API service using Dataiku, which enables authorized users and systems to leverage the existing model.
The sales team has a list of new customers, their country, first spend, and some other metrics for Q2. Their details must be shared with the relevant stakeholders if they are predicted to have a high lifetime value, which is where we will rely on the prediction from our AI model. The full process — taking the raw list of customers, obtaining their individual predictions, and then sharing the details of only the highest-predicted customers — can be automated by an RPA process created in UiPath.
Once the various data sources are combined, we can use Dataiku to produce two different predictive models within the Visual ML Lab, so that predictions of lifetime value can be made against new customers. From a choice of open source models, perhaps we find that the XGBoost and random forest models are the most suitable; these can be deployed and made available in the flow as green diamond icons:
From the flow, we can deploy the models as separate endpoints on the same API service. There is the choice to deploy to a new or existing service, enabling queries to be made from outside Dataiku, such as from other applications. The same can of course be done for a Python function, if we had produced something in code. For this project, take note of the endpoint URL, as we can use it to introduce user input within UiPath.
To automate the start of the RPA process, we can add a step within a scenario, which can be triggered in Dataiku by certain factors or outcomes. Here, let’s use the “UiPath Orchestrator - UiPath Job” step within a scenario, and specify the process and robot we want to kick start.
Within UiPath Studio, the first thing to do is to download the Dataiku package, which makes the “Query DSS API node” activity available for use within the project workflow. There are various ways to extract data and introduce end-user interaction within UiPath Studio. For this project, we’ll take our input data from a spreadsheet and introduce an input dialog, allowing users to select which prediction model ultimately gets used (i.e., perhaps the API endpoints serve separate teams).
First, we’ll select the algorithm to be used within the RPA process. UiPath allows for user input either as a text box or multiple choice; here, we’ll go with a multiple-choice input dialog to allow for easier algorithm selection. We’ll give users two options, Random_Forest_pred and XG_Boost_pred, and save this choice, as it will come in handy later down the line. Each option is simply the name of the API endpoint, which can be found within the Dataiku API endpoint URL.
Now, we’ll set up the relevant activity steps to read the customer data we want to predict with. Each row in the Excel table represents a new customer whose age, web history, purchase, and geographical data will be used to generate a prediction at the API. Therefore, we will first need to convert a row of data into the relevant JSON format to be accepted by Dataiku.
This can be done in UiPath, for example, by looping through each row, saving the data as a data table, and then converting it to JSON using the following activities. We can also save individual fields from the row as values for future use, such as our “Customer ID” field below:
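For readers who find the transformation easier to follow in code than in activities, the row-to-JSON step looks roughly like this in Python. The column names are hypothetical; match them to your own spreadsheet.

```python
import json

def row_to_query(row):
    """Convert one spreadsheet row (as a dict) into the JSON body the
    Dataiku prediction endpoint expects: {"features": {...}}."""
    features = {
        "age": row["age"],
        "country": row["country"],
        "first_spend": row["first_spend"],
        "pages_visited": row["pages_visited"],
    }
    return json.dumps({"features": features})

# One example row; "Customer ID" is kept aside for later use,
# just like the saved value in the UiPath loop.
row = {"Customer ID": "C-1042", "age": 41, "country": "FR",
       "first_spend": 58.0, "pages_visited": 12}
customer_id = row["Customer ID"]
query = row_to_query(row)
```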
Once we have the required JSON query, we can now use the Dataiku activity to query our API node.
Here, we can enter the endpoint URL and make it depend on the initial user input: we’ll take the saved value from the input dialog and include it within the string for the endpoint URL.
We’ll enter the API key obtained for the API service within Dataiku (this is the same regardless of endpoint), and specify in the activity’s properties an optional timeout limit and the variable the returned JSON should be saved to.
API queries in Dataiku return a response whose structure depends on the endpoint function. For this example of a predictive model, we can expect the format “result: prediction,” which makes it easy to extract the predicted customer lifetime value of each of our new customers and save it as a variable. Since the sales and marketing team is only interested in customers with a high lifetime value, we can specify next actions only if the predicted customer lifetime value is over a certain amount, such as $200.
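As a sketch, assuming the reply has the “result: prediction” shape described above, extracting the prediction and applying the $200 threshold might look like this:

```python
import json

HIGH_VALUE_THRESHOLD = 200.0  # only customers above this are shared onward

def extract_prediction(response_text):
    """Pull the predicted lifetime value out of the API node reply."""
    return float(json.loads(response_text)["result"]["prediction"])

def is_high_value(response_text, threshold=HIGH_VALUE_THRESHOLD):
    """Mirror the UiPath 'if' condition (Predicted_Value > 200)."""
    return extract_prediction(response_text) > threshold

# Illustrative reply, not real model output.
reply = '{"result": {"prediction": 317.5}}'
```

In the UiPath workflow, this is the same check performed by the “if” activity inside the loop.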
This is done within the same loop, using additional activities to save the prediction value for each customer and a further “if” activity to add the next steps when our condition (Predicted_Value > 200) is met. Of course, there are plenty of connecting activities and systems UiPath can automate and link to, but for this example we will simply add the details and predictions of each of our customers into a PowerPoint deck. Let’s add a new slide for each new customer and populate it with the relevant information.
Recall that we previously saved a value “Customer ID,” which we extracted from the Excel sheet row data. This can be inserted as a text field within an existing slide. We can also add in custom text, calling different values from the Excel sheet and also including our saved predicted value:
The result is that we now have a presentable format to share our previously raw data, as well as any meaningful business impact predicted using the Dataiku model, which has all been automated:
To Wrap Things Up
We have used this Dataiku project to externally trigger an RPA process, which in turn generates a prediction for new customer data from a model deployed as a Dataiku API service. The predictive model outputs can then be integrated further into a company’s wider existing processes.
At the end of this second blog on RPA and AI, we can see a few differences from where we finished off our previous blog. For example, the use cases covered last time involved AI being integrated within the RPA robot itself and ultimately used (i.e., using computer vision) to extract information from unstructured data, such as written text. This extracted information is then used within the RPA process and existing systems, for example to determine whether a fraud claim should be paid out. Here, by contrast, the robot may already have used existing internal computer vision capabilities to extract information from an image, but we add another step where this information is used to predict a wider business result or outcome.
There are further examples where this kind of combination between an RPA process and a Dataiku AI workflow can be useful. For example, a fraud prediction model could be deployed into production using Dataiku and utilized by key risk stakeholders. If the model predicts a detrimental output for the business, Dataiku metrics and checks can trigger an RPA process to reevaluate existing processes and eliminate the risk.
Similarly, a complex supply chain process can be automatically diverted and adjusted by an RPA robot, such as if an AI model predicts a pending part failure or a change in inventory level due to external factors on supply or demand. These are processes that might normally be time-consuming and difficult to carry out — as they require manual intervention and decision making from business teams — but which are greatly simplified thanks to these technologies. By combining both process automation and predictive intelligence, much more agility can be achieved in all aspects of the business, and we hope you start to see the opportunities too!