• Examples of common quality challenges for packaging manufacturers, and potential solutions.
  • Tips for determining where their operations fall on the digital manufacturing continuum.
  • Guidance on using Industry 4.0/Digital Manufacturing/IoT to collect appropriate data and maintain product quality.

Quality is critically important for customer retention in any industry. When you lose a major contract because of a quality issue, it can be a long time before you can win that customer back, if ever.

In a recent TBM management briefing, we noted that quality is a critical factor in customer retention for packaging manufacturers. In this industry, the quality challenges vary depending on the type of product.

For example, when manufacturing a polypropylene-coated paper cup or other food container, all of the components have to be printed, cut, formed and joined correctly. Tolerances are tight, especially where different material types come together, such as where a paper rim has to accommodate a rigid plastic lid. Tracing the causes of leaks and other failures can point to engineering issues on the line or to paper suppliers. We’ve traced such quality problems to bare spots and ripples in the paper extrusion process, and to how the rolls of coated paper were packaged and packed for shipment.

Thermoformed plastic packaging presents other challenges. Beyond the forming process itself, the units have to be stacked and bundled correctly. Food packaging trays, for example, when loaded into customers’ machines, can’t stick together or they will jam the equipment.

The Digital Manufacturing Continuum

When it comes to digital manufacturing capabilities, companies fall somewhere on a continuum. At one end are operations that collect and store little to no machine/process data. At the other end are those that are monitoring performance and collecting data from multiple points on every piece of equipment on their lines.

In between these extremes, some manufacturers are collecting machine data but not storing or aggregating it. Others may pull together disparate data streams, but they haven’t done much with the data yet. Packaging operations, paper mills, chemical plants, food producers and similar operations have been monitoring process parameters in real time for decades, but they haven’t been collecting and storing the data.

Our engagements with packaging industry clients typically include a number of quality-related issues that require attention. Defects may be causing customer chargebacks and returns to increase, eroding margins. Or machine stoppages and rising reject rates are causing scrap and rework to creep upward, which also increases costs.

For those of us who have been in industry for a decade or more, it’s still remarkable to see the terabytes of machine data that can now be captured cost-effectively, especially compared to the time-consuming manual methods of the not-too-distant past. Some manufacturers are capturing these datasets, but no one is looking at the data or analyzing the impact that the inputs are having on the outputs.

On one recent project, the plant had been manually recording data on machine stoppages and breakdowns. When a team looked at the data, it found that the biggest reason for line stoppages was a miscellaneous catchall code, which wasn’t very helpful to their problem-solving efforts. To improve data accuracy and reliability, the plant began capturing stoppage data directly from the machine PLCs and sensors. With accurate time and reason codes, machine problem areas were much easier to identify.
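Once stoppage events carry real reason codes from the PLC, ranking downtime by code is straightforward. Here is a minimal sketch in Python; the reason codes and durations are hypothetical, not from any actual controller:

```python
from collections import Counter

# Hypothetical stoppage log captured from the machine PLC:
# (duration in minutes, reason code). Codes are illustrative only.
stoppages = [
    (12, "JAM_INFEED"), (3, "JAM_INFEED"), (45, "HEATER_FAULT"),
    (7, "JAM_INFEED"), (20, "HEATER_FAULT"), (5, "MISC"),
]

# Total downtime per reason code, worst offender first.
downtime = Counter()
for minutes, code in stoppages:
    downtime[code] += minutes

for code, minutes in downtime.most_common():
    print(f"{code}: {minutes} min")
```

With accurate codes replacing the old catchall, the worst offenders surface immediately instead of hiding in “miscellaneous.”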

Getting to Root Causes

For product quality issues that make it to customers, preliminary analysis should quickly determine if the source is internal or external. For packaging products, external quality problems often revolve around material handling and shipping, which could require changes to packaging, packing and loading methods.

Simple error-proofing methods can prevent external quality issues from eroding customer confidence. One of our clients’ machines can print four streams simultaneously, with a different printed product on each stream, producing four stacks of flat printed, creased and cut cartons, which are then palletized. To prevent cartons from mixing with other products, or from being placed on the wrong pallets, the client added a mini bar code at the printing stage. The code is automatically checked on the sealing machine, making it impossible for workers to ship the wrong product to customers.
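A poka-yoke check of this kind reduces to a simple comparison at the sealing machine. The sketch below is purely illustrative; the product codes, function name and stop message are assumptions, not the client’s actual system:

```python
# Hypothetical poka-yoke at the sealing machine: every scanned mini bar
# code must match the product assigned to the current pallet. Product
# codes and the stop message are assumptions for illustration.
def seal_station(scanned_codes, pallet_product):
    for position, code in enumerate(scanned_codes):
        if code != pallet_product:
            # A real station would halt the line here, not just report.
            return f"STOP: wrong product at carton {position}"
    return "OK"

print(seal_station(["A1", "A1", "A1"], "A1"))  # OK
print(seal_station(["A1", "B2", "A1"], "A1"))  # STOP: wrong product at carton 1
```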

Some of the costs associated with internal quality issues include extra testing and inspection, equipment downtime while issues are fixed, and the time spent analyzing and identifying root causes, which can be significant. This is where digital manufacturing technology – data monitoring, collection and analysis, specifically – can be a huge help.

Over the course of our careers, we have performed Pareto analysis on thousands of production processes to determine root causes of quality and other issues. Classification and regression tree (CART) analysis is another tool we use regularly to identify the most important variables in a dataset.
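A Pareto analysis is simple to script once defect counts are available. A minimal sketch, using made-up defect causes and counts and the conventional 80/20 cutoff:

```python
# Hypothetical defect counts by cause; the values are made up.
defects = {"seal leak": 120, "misprint": 45, "warp": 20,
           "scratch": 10, "other": 5}

total = sum(defects.values())
cumulative = 0.0
vital_few = []  # the causes that together explain ~80% of defects

# Walk causes from most to least frequent, accumulating their share.
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count / total
    vital_few.append((cause, count, round(cumulative * 100, 1)))
    if cumulative >= 0.8:  # classic 80/20 cutoff
        break

print(vital_few)  # the "vital few" causes to attack first
```

The output is the short list of causes worth attacking first; the “trivial many” are deferred until the vital few are resolved.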

Data collection has always been the most time-consuming element of such projects; the analysis itself doesn’t normally require a lot of resources. With today’s digital manufacturing technology, the data may already be available and only needs to be segregated and analyzed, greatly accelerating the search for solutions.

When quality issues arise and you want to conduct rapid data analysis, you first have to be collecting the appropriate data at the key quality control checkpoints in your production processes. You can then fill any data monitoring and storage gaps. The goal of such prep work is to shorten the time between quality alerts and when issues are rectified, minimizing any financial impact. Pre-built analytical models can quickly pinpoint the variables that are having the greatest impact on defects.

Some words of caution, however. We’ve seen clients install electronic data monitoring capabilities that then triggered an endless stream of alerts, for example when temperatures ran five degrees above or below the specified band. It wasn’t long before operators and supervisors routinely ignored or muted the alerts because they knew such variations wouldn’t affect production or quality. When the line did go down, they looked at the data and could see where they might have acted to prevent the shutdown if they had been watching the right data point.
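One common remedy for alert fatigue is to add a deadband and a persistence requirement, so an alert only fires when a reading stays meaningfully outside the band for several consecutive samples. A sketch of the idea, with illustrative thresholds that are not tied to any specific process:

```python
def should_alert(readings, setpoint, band, persist=3):
    """Fire an alert only when readings sit outside setpoint +/- band
    for `persist` consecutive samples; brief excursions are ignored.
    All thresholds here are illustrative, not process-specific."""
    run = 0
    for value in readings:
        if abs(value - setpoint) > band:
            run += 1
            if run >= persist:
                return True
        else:
            run = 0  # back in band, reset the streak
    return False

# A brief 5-degree wobble around a 200-degree setpoint: no alert.
print(should_alert([205, 206, 204], setpoint=200, band=10))   # False
# A sustained 15-degree excursion: alert.
print(should_alert([215, 216, 214, 213], setpoint=200, band=10))  # True
```

The point is that the band width and persistence count should come from the people who know the process, not from a vendor’s defaults.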

The problem in such cases is not at the line level; it is created when the data collection capabilities and alerts are set up. No one asked the engineers, the people running the line or the maintenance team which data points mattered most. It may seem like a basic question, but far too often it is not asked, especially when outside vendors perform the technology integration.

Even when everything stays within the target parameters and specifications, trend analysis can highlight growing problems before they become serious. This approach is similar to safety management programs that track near misses and mitigate issues before a worker injury can occur. Of course, any proactive program, whether it’s safety- or quality-related, requires a higher level of management commitment in terms of strategy and resource allocation.
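Such trend checks can be automated with simple run rules borrowed from statistical process control. The sketch below flags a sustained upward drift even when every point is still in spec; the run length of six is a common SPC rule of thumb, not a standard requirement:

```python
def trending_up(values, run_length=6):
    """Flag a sustained upward drift: `run_length` consecutive rising
    points, even if all of them are still within spec. The default of
    six is a common SPC convention, not a mandated threshold."""
    run = 0
    for prev, curr in zip(values, values[1:]):
        if curr > prev:
            run += 1
            if run >= run_length - 1:  # N rising points = N-1 increases
                return True
        else:
            run = 0
    return False

# A slow drift that never leaves spec but trends steadily upward:
print(trending_up([70.1, 70.2, 70.4, 70.5, 70.7, 70.8, 71.0]))  # True
```

Catching the drift here, rather than at the spec limit, is the quality equivalent of acting on a near miss.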

Have you been able to leverage digital manufacturing capabilities to respond faster to quality issues? If so, we would love to hear about your challenges and solutions.