
Variability Reduction: Why Is It Important to Manufacturers?

Written by Team ProcessMiner | Dec 6, 2020 5:11:39 PM

 

A routine issue faced by most manufacturers is process variation. Variability in the process can wreak havoc on product quality and customer satisfaction, and it has a severe impact on revenue, cost, and margins.

In the highly competitive manufacturing market, the champions will be those with a strategy to mitigate this variability.

 

Focus on Variability Reduction Strategy

Variability in the manufacturing process is the difference between the produced quality measure and its target. High variability leads to either waste or excess production costs.
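
As a concrete illustration of that definition, here is a minimal Python sketch, using made-up thickness readings rather than real mill data, that separates how far the process sits from its target (bias) from how much it spreads around its own mean (variability):

```python
import statistics

# Hypothetical thickness readings (mm) and target -- illustrative numbers only
target = 0.50
readings = [0.47, 0.52, 0.49, 0.55, 0.44, 0.51, 0.48, 0.53]

# Bias: how far the process mean sits from the target
bias = statistics.mean(readings) - target

# Variability: how much the readings spread around their own mean
sigma = statistics.stdev(readings)

print(f"mean offset from target: {bias:+.3f} mm")
print(f"standard deviation:      {sigma:.3f} mm")
```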

Unfortunately, because manufacturing systems are stochastic by nature, process variability is unavoidable.

 

But it is controllable, and with the right strategy it can be minimized.

 

For example, think of paper manufacturing. An important paper quality measure is thickness. Due to high process variability (as shown in Figure 1), the thickness sometimes falls below the lower specification limit set by the customer, resulting in lost sales.

To avoid this loss, operators often overfeed the machine with more wood fiber. This pushes the process mean upward, resulting in higher production quality (see Figure 2).

However, this strategy is extremely costly due to the overconsumption of raw materials. Therefore, the right strategy is to reduce process variability. As shown in Figure 3 below, reducing the variability reduces the out-of-limit instances. It also allows the production target range (product specification limits) to be tightened, further reducing material cost and waste while increasing throughput.
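
To see why reducing variability beats simply pushing the mean up, here is a minimal simulation sketch of the three scenarios in Figures 1-3. The target, specification limit, and process parameters are illustrative assumptions, not numbers from any real mill:

```python
import random

random.seed(42)

LSL = 0.45      # hypothetical lower specification limit (mm)
N = 100_000     # number of simulated reels

def pct_below_lsl(mean, sigma):
    """Fraction of simulated reels whose thickness falls below the limit."""
    return sum(random.gauss(mean, sigma) < LSL for _ in range(N)) / N

# Figure 1: high variability centered on the 0.50 mm target -> frequent violations
print(f"high variability:    {pct_below_lsl(0.50, 0.030):.1%} below LSL")

# Figure 2: overfeed fiber to push the mean up -> fewer violations, more raw material
print(f"mean pushed upward:  {pct_below_lsl(0.54, 0.030):.1%} below LSL")

# Figure 3: mean on target, variability cut in half -> fewer violations, no extra fiber
print(f"variability reduced: {pct_below_lsl(0.50, 0.015):.1%} below LSL")
```

Under these assumed numbers, shifting the mean and tightening the spread both cut the out-of-limit fraction sharply, but only the latter does so without consuming extra fiber.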

However, the major challenge in reducing process variability is an operator's inability to measure product quality at all times. Most manufacturers perform quality tests at the end of the production cycle or at long intervals (e.g. 45 minutes or more in the mills we have worked with). In paper mills, the operator takes a sample from a reel of paper at its completion and tests it in a lab while the manufacturing process continues. Based on the lab results, the operator then makes changes to the process. However, if the quality test fails, a batch of product has already been lost.

This lag in lab results can be very frustrating, and it presents both a problem and an opportunity for manufacturers. Why? Because the results from the lab tests are used to adjust the process settings. These adjustments assume that all variables involved in the process will remain constant and consistent with the variables recorded at the time of the test. In reality, most of them, such as speed and temperature, will most likely change.

This leaves the operator either constantly chasing changes in the quality variables by making continual adjustments in the dark, or setting the controls to a certain “recipe card” and waiting for the next lab test. There must be a better way to manage this process. Viewing the paper reel example above as an opportunity can help us address the lag issue with a new, different, and more effective process.

 

Below are a few questions worth considering:

  1. What if the quality could be predicted every 30 seconds instead of every 45 minutes or more?
  2. What if process variables like speed, temperature, and raw material were incorporated into real-time predictions of product quality?
  3. What if real-time recommendations were sent to operators and production supervisors to achieve quality targets and reduce variability? What if this led to a closed loop? (A minimal sketch of this idea follows the list.)
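
To make these questions concrete, here is a minimal sketch of such a prediction loop using scikit-learn and simulated sensor readings. The model, the three process variables, the specification limit, and the 30-second cadence are all illustrative assumptions, not ProcessMiner's actual implementation:

```python
import time
import numpy as np
from sklearn.linear_model import LinearRegression

# Train on historical (process variables -> measured quality) pairs.
# The data here is simulated; in practice it would come from the plant historian.
rng = np.random.default_rng(0)
X_hist = rng.normal(size=(500, 3))   # e.g. speed, temperature, fiber feed (scaled)
y_hist = 0.50 + 0.02 * X_hist @ np.array([1.0, -0.5, 0.8]) + rng.normal(0, 0.005, 500)
model = LinearRegression().fit(X_hist, y_hist)

LSL = 0.45  # hypothetical lower specification limit (mm)

def read_sensors():
    """Stand-in for a live read of the same three process variables."""
    return rng.normal(size=(1, 3))

# Predict quality every 30 seconds instead of waiting ~45 minutes for a lab test.
for _ in range(3):  # a few iterations for demonstration
    predicted_thickness = model.predict(read_sensors())[0]
    if predicted_thickness < LSL:
        print(f"predicted {predicted_thickness:.3f} mm: below limit, recommend adjustment")
    else:
        print(f"predicted {predicted_thickness:.3f} mm: within spec")
    time.sleep(30)  # shorten or remove when experimenting
```

Closing the loop would mean feeding the recommended adjustment back to the process control system automatically rather than only printing it for an operator.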

Today, companies with Industry 4.0 technologies, like ProcessMiner Inc. in Atlanta, Georgia, are working with manufacturers to layer their real-time predictive analytics and AI solutions on top of complex manufacturing processes to reduce variability, increase profit margins, and improve customer satisfaction.

 

The results are staggering, but how does it work?

In most cases, ProcessMiner can utilize existing sensor technology and historian architectures to drive data to its secure cloud-based analytics platform. This means no development or hardware changes for the manufacturer. Additionally, ProcessMiner brings industry expertise to the manufacturer's process. With this platform, there is no need to hire additional data scientists or process engineers for deployment.

The ProcessMiner solution constantly ingests data to deliver clear quality predictions coupled with real-time recommendations on process changes. This ensures production quality is consistently delivered and variability is reduced, all through a comprehensive yet easy-to-understand user interface that delivers timely data to both the operator and supervisor.

You might be asking, "Could the ProcessMiner solution work for our manufacturing process?" The simple answer is, "Yes, most likely it will."

 

Questions to consider when looking at a real-time predictive analytics solution:

  1. Are your quality measures clearly defined, and are there existing sensors collecting data along your manufacturing process?
  2. Are you unsatisfied with your ability to meet those quality standards?
  3. Is there a financial impact on your organization if quality measures are not consistently met? Have you determined what that financial impact is?
  4. Have you encountered barriers in developing or researching a machine learning/predictive analytics/AI solution?
  5. Have you invested in “predictive maintenance,” and are you now looking for a solution that predicts and improves quality?

If you answered yes to any or most of these questions, your team may benefit from ProcessMiner's advanced AI to reduce variability in your production processes.

Originally posted on Medium.