Stop Trying to Predict Yield, Improve It

By Christine Andrews

The cost of quality averages between 15% and 40% of sales. With numbers like that, it’s hardly surprising that many manufacturers treat yield improvement as a top priority. In semiconductor and pharmaceutical manufacturing, a 1% yield improvement can translate into tens of millions of dollars. No manufacturer disputes that waste is costly and that yield drives tremendous value. The question, as with anything, is the investment needed to drive these improvements.

The focus on yield and waste reduction is not a new idea. Shewhart was one of the early pioneers of process control in manufacturing, and his control charts are still in use today. In the 1980s, Deming popularized total quality management (TQM). In the 1990s, the term lean quickly became synonymous with waste reduction.

Fast forward to 2024 and many are talking about predictive quality using AI. The theme is clear. As technology has evolved, so has the approach to improving quality — or so it seems. Paper-based reporting, unwieldy spreadsheets, and post hoc analyses are still the mainstay for many enterprises. That isn’t because these tools are easy or optimal. It’s because they are known and familiar to the people closest to the shop floor, where yield is ultimately determined.

This is the challenge many organizations face — addressing shop floor needs with top-floor-driven initiatives. It’s a change management problem and a technology problem, and you can’t solve one without addressing the other. Quality and yield management is maturing across industries, and it matures most successfully in phases. A common evolution that enables the shop floor to deliver on top-floor yield objectives can be broken down into three phases:

1. Computer Vision for Defect & Quality Testing

The first step to improving yield is capturing quality and defect information accurately. Are you throwing away good product? Are you catching bad product too late, after you’ve already invested more time, materials, and energy in a product that will ultimately be scrapped? Computer vision-based defect detection is a relatively easy way to address both questions.

Next, automatic rejection of bad product is common practice in industries where margins and compliance pressures demand closed-loop control. Smart cameras are growing in popularity, and off-the-shelf technology can be quickly and easily leveraged to identify defects in raw materials, semi-finished, and finished goods — all of which impact final yield.

Incorporating more vision systems into final product quality testing has a significant impact on reject rates by avoiding the subjective decision making that often causes good product to be quarantined or scrapped. Leverage off-the-shelf tools where it makes sense; for the remaining use cases, build models that can be easily adapted to a wide variety of products and processes, or focus on the high-value areas that operations identifies as most problematic.
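As a rough illustration, the core of such a defect check can be sketched in a few lines. Everything here is invented for illustration — the reference image, the tolerances, and the `inspect` helper are assumptions; a production system would use a calibrated vision pipeline rather than a raw pixel comparison:

```python
import numpy as np

def inspect(image: np.ndarray, reference: np.ndarray,
            pixel_tol: int = 30, defect_frac: float = 0.01) -> bool:
    """Return True if the part should be rejected.

    Flags a part when more than defect_frac of its pixels deviate
    from a known-good reference by more than pixel_tol gray levels.
    Both thresholds are placeholders to be tuned per product and camera.
    """
    deviation = np.abs(image.astype(int) - reference.astype(int))
    defect_pixels = np.count_nonzero(deviation > pixel_tol)
    return defect_pixels / image.size > defect_frac

# Synthetic 100x100 grayscale frames standing in for camera captures
reference = np.full((100, 100), 128, dtype=np.uint8)
good = reference.copy()
bad = reference.copy()
bad[40:60, 40:60] = 255  # a 20x20 blemish, 4% of the frame

print(inspect(good, reference))  # False: no deviation from reference
print(inspect(bad, reference))   # True: blemish exceeds 1% of pixels
```

The same pass/fail logic can sit behind an automatic reject gate, with the thresholds validated against operator decisions before being trusted.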


2. Diagnostic Analytics for Root Cause & Yield Improvement

When it comes to yield, early intervention is key. Decisions about when to reject product, pause production, or abort a process completely are usually made based on operator experience and institutional knowledge. Post hoc understanding of why some runs or products result in low yield while others don’t requires either vast experience or painstaking root cause analysis. In an era of unpredictable turnover, that means yield will invariably suffer.

Centerlining has long helped with this kind of knowledge sharing around processes and products, but its challenge is scale and sustainment. The amount of data can be vast, and the variables that impact quality can number in the thousands. Quality and process engineers often look at only a handful of the most impactful process parameters. Are they missing key operating parameters? That’s one of the questions AI and ML can help answer, and it’s why many organizations start by validating SOP control ranges and machine settings against historical data. Doing so creates a more active, broad, and dynamic centerlining program that can be managed and supported centrally. The resulting ML models can be integrated with recipe cards and bills of process for improved production execution, which means better yield.
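A minimal sketch of that SOP validation step, under invented assumptions: the parameter, its documented window, and the synthetic relationship between temperature and yield are all placeholders, but the pattern — compare the documented range against what the best historical runs actually did — is the core of the exercise:

```python
import numpy as np

# Documented SOP temperature window for the process (invented), degrees C
sop_range = (175.0, 185.0)

# Synthetic historical data: 500 runs with recorded temperature and yield.
# In this toy data, higher temperature happens to produce higher yield.
rng = np.random.default_rng(1)
temps = rng.normal(181, 2, 500)
yields = 88 + 0.5 * (temps - 175) + rng.normal(0, 1, 500)

# Where did the top-decile runs actually operate?
high_yield = temps[yields >= np.percentile(yields, 90)]
observed = (np.percentile(high_yield, 5), np.percentile(high_yield, 95))

print(f"SOP window:      {sop_range[0]:.1f} - {sop_range[1]:.1f}")
print(f"Best-run window: {observed[0]:.1f} - {observed[1]:.1f}")
```

If the best-run window sits consistently off-center from the SOP window, that is a prompt to revisit the centerline — not to change it automatically, but to put the question in front of the process engineers with data behind it.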

Technology that gives quality and operations teams improved starting points based on past results is a relatively easy way to begin integrating data science with operations for yield improvement. Validating setpoints and control settings using data from the existing automation and control systems doesn’t require a heavy MES deployment. And diagnostic analytics that suggest which variables are most correlated with poor yield can create a tremendous amount of value.
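That diagnostic ranking can start as something very simple. The sketch below ranks candidate process variables by their correlation with yield; the parameter names, units, and the synthetic relationship are invented, and in practice correlation is only a first-pass screen before proper root cause work:

```python
import numpy as np

# Synthetic historical data for three hypothetical process parameters
rng = np.random.default_rng(0)
n = 200
temperature = rng.normal(180, 5, n)   # degrees C
pressure = rng.normal(2.0, 0.1, n)    # bar
line_speed = rng.normal(50, 3, n)     # units per minute

# In this toy data, yield is driven mostly by temperature
yield_pct = 20 + 0.4 * temperature + rng.normal(0, 1, n)

params = {"temperature": temperature, "pressure": pressure,
          "line_speed": line_speed}

# Rank parameters by absolute correlation with yield
ranked = sorted(params,
                key=lambda p: -abs(np.corrcoef(params[p], yield_pct)[0, 1]))
for name in ranked:
    r = np.corrcoef(params[name], yield_pct)[0, 1]
    print(f"{name:12s} |r| = {abs(r):.2f}")
```

With thousands of real variables, the same idea scales into feature-importance tooling, but the output is the same: a short list of parameters worth an engineer's attention.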

3. Prescriptive vs. Predictive Modeling

You need to predict so you can prescribe. Forecasting yield and quality can lead to a variety of actions, from halting processes to adjusting equipment settings in real time, which reduces costs and improves margins. The first consideration should be what action is expected and how the predictive model supports that action. Otherwise, you can have a great, performant, and accurate model that results in no net change to the status quo.

Translating prediction into action requires some degree of prescriptive decision support. That can be as simple as an alert advising of a sub-optimal production process. Or it can be a set of corrective actions recommended based on operational health. It can also mean automatic adjustment of equipment settings for closed-loop control. Model predictive control (MPC) and advanced process control (APC) are examples that have been employed across various industries for many years to maintain process control based on prediction, though they are typically limited in the variety of inputs they consider and often require significant investment.
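The simplest form of that decision support is a mapping from predicted yield to a recommended action. The thresholds and action texts below are invented placeholders; a real deployment would use a trained model and control limits validated with engineering:

```python
def prescribe(predicted_yield: float) -> str:
    """Map a predicted yield (%) to a recommended operator action.

    Thresholds are illustrative only; real cut-offs would come from
    validated control limits, not hard-coded numbers.
    """
    if predicted_yield < 70:
        return "HALT: abort run and escalate to engineering"
    if predicted_yield < 85:
        return "ADJUST: move settings back toward centerline"
    return "CONTINUE: process within normal operating range"

for pred in (95.0, 80.0, 60.0):
    print(f"predicted {pred:.0f}% -> {prescribe(pred)}")
```

Even this trivial mapping forces the right conversation: what should an operator actually do differently when the model speaks?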

Predictive modeling should therefore be undertaken to support a definitive alert and/or corrective action. That also means you need a feedback loop in place to ascertain the value of the model output and the willingness of operations teams to intervene. Feedback loops are a cornerstone of continuous improvement and of the short- and long-term success of any predictive yield effort.

Start with simple feedback loops that indicate whether any action was taken. Use A/B testing here as well to ascertain the value of following, or not following, prescriptive recommendations. Only by validating that following a model’s advice beats not following it can you really start gaining buy-in across the organization and see measurable improvement in the yield metric. And it’s okay if the model results don’t show anything conclusive. That may simply mean you’re not looking at the right data or not incorporating enough variables, or that you need to go back to step 2 and refocus on diagnostic tools.
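The feedback loop itself can start as a simple comparison. Below is a toy A/B readout using synthetic data — the runs, yields, and the effect size are invented, and a real analysis would add a proper significance test before claiming a lift:

```python
import random
import statistics

# Synthetic outcomes: yield (%) on runs where operators followed the
# model's recommendation vs. runs where they did not. Numbers invented.
random.seed(42)
followed = [random.gauss(92, 2) for _ in range(60)]
ignored = [random.gauss(89, 2) for _ in range(60)]

lift = statistics.mean(followed) - statistics.mean(ignored)
pooled_sd = statistics.stdev(followed + ignored)

print(f"mean yield lift when following advice: {lift:+.1f} pts "
      f"(pooled sd {pooled_sd:.1f}, n={len(followed)} per arm)")
```

Tracking this one number over time — did following the advice beat ignoring it? — is what turns a model from a dashboard curiosity into something operators trust.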

It’s Okay to Go Slow

Yield prediction is the holy grail for many industries with high raw material and processing costs. In others, low yield drags down OEE and represents a big cost reduction opportunity. Regardless of the priority and margin improvements involved, scrap and waste are sunk costs. For that reason alone, yield is a big focus for most manufacturers. But getting to the point where savings can be realized means taking a strategic, gradual approach.

When large organizations jump straight to prediction without thinking about prescription, and without ever addressing the key diagnostic questions that plague many engineering and quality leaders, they often frustrate both data and operations teams. The best approach is one that addresses the needs of the shop floor personnel who ultimately determine the final yield number. Provide those teams the help they need and want, and do so with their knowledge as the foundation of yield improvement initiatives.
