Laura Stridiron explains how, post-pandemic, big data solutions will transform food and beverage manufacture
For any food and beverage manufacturer, lapses in product quality are an ever-present threat. Minor variations in raw materials, temperature, environmental conditions, or the minute malfunctioning of highly calibrated process equipment or systems – they all affect quality. The problem is compounded because in today’s complex processes the causes are frequently hidden from human view. Unless, that is, companies employ advanced data analytics.
Take the example of an agricultural products company with three parallel production lines. One of these lines – line 3 – suffered seemingly inexplicable variances in quality, causing losses of more than $400,000 per year in unusable product. It took the analysis of 1,100 data points to work out that the cause was a cooling water source shared between the lines. The solution was to install a dedicated cooling water line for line 3.
The analyzed data was not generated by the process alone; it included information about the entire production facility. There were eight months of data to analyze, but only a matter of days were needed before the source of the problem was revealed.
Big data capabilities are having a big effect
This is one example of how big data – the analysis of masses of data with sophisticated yet user-friendly tools – is transforming the efficiency with which the food and beverage industry optimizes product, processes and plant. These are capabilities much in demand: maintaining quality and ensuring good manufacturing practices (GMPs) despite all the variables has become an ever more pressing requirement.
The food and beverage industry faces ever-stricter regulatory requirements, such as the Food Safety Modernization Act (FSMA) in the US, and the new trend for ingredient traceability. The faster any manufacturer can turn raw data into actionable insights to ensure these requirements are met, the better – especially when it does not require major investment in specialist data-science teams.
Covid-19 has taught the importance of learning how to maintain efficiency in tough conditions
The Covid-19 crisis has re-emphasized how important it is to have these capabilities when seeking to maintain quality and overall equipment effectiveness (OEE) through labor shortages and consumer hoarding. The heightened importance of disinfecting, air filtration, employee hand-washing and sanitizing is also here to stay. Nobody wants to see a repeat of the costly Covid outbreaks in meat-processing plants across the industry, for example.
Increased transparency and constant improvement through more effective use of data are the key to meeting these challenges while increasing profitability. An uplift in OEE of as little as five to ten per cent yields huge results for some companies. It is a message that is finding a receptive audience. Data has been collected passively over the years, but only more recently has it become possible to extract significant value from it in a short time-frame, boosting quality assurance and increasing facility uptime in significant ways.
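To see why a modest OEE uplift matters, it helps to recall that OEE is conventionally calculated as the product of availability, performance and quality ratios. The figures below are purely illustrative, not drawn from any company mentioned in this article:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """OEE = availability x performance x quality, each expressed in [0, 1]."""
    return availability * performance * quality

# Hypothetical line: running 90% of planned time, at 85% of rated speed,
# producing 97% good product.
baseline = oee(0.90, 0.85, 0.97)   # ~0.742, i.e. ~74% OEE
improved = oee(0.95, 0.90, 0.98)   # ~0.838, i.e. ~84% OEE
print(round(baseline, 3), round(improved, 3))
```

Because the three factors multiply, small gains on each compound into a double-digit OEE improvement – which is why a five-to-ten-point uplift can translate into substantial extra saleable output.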
Historically, data was kept on separate systems in the food and beverage sector, which made acquiring insights a complex task requiring specialist skills. These challenges have now been met through the development of solutions that enable almost anyone to be trained quickly to interpret or influence data. This is especially so in asset performance management (APM), where tools to help with downtime-avoidance, production quality, throughput, yield and risk analysis are extracting more out of existing data.
Preventive maintenance and AI are taking efficiency forward
Among the most significant trends in this field are preventive maintenance and the application of machine learning and artificial intelligence (AI). The latest APM tools deliver actionable insights through a technique known as prescriptive analytics, used to predict when a piece of equipment is likely to break down and to determine the cause, enabling manufacturers to take remedial action to avoid downtime. Such tools typically also deliver a reduction in safety incidents and shrink maintenance costs. By improving equipment and process availability, these tools generate a rapid return on investment. Cloud and edge connectivity are also important, enabling APM software to be used in facilities with aging equipment and allowing older assets to be brought on board without huge capital investment.
The most immediately effective solutions can be plugged into existing systems in a matter of days, analyzing and providing insights from data that already exists, rather than requiring installation of hundreds more sensors in a process or production line. Such solutions address all assets, not simply those appearing to be most critical. The analysis must cover both the asset and the end-to-end process of which it is a part.
The distinctive feature of the best AI solutions
What makes the best of these machine learning or AI-powered solutions stand out from the also-rans is a constant self-learning ability to improve by analyzing both anomalies and failures in a process. A solution must be capable of developing anomaly and failure agents for target equipment and assets to enable more accurate detection of anomalies and to outline explicit patterns of failure, along with prescriptive advice on correcting the discovered conditions.
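The anomaly-detection idea described above can be sketched in miniature. The snippet below is a deliberately simple illustration – a rolling z-score check on a stream of sensor readings – and is not the method of any particular APM product; the cooling-water temperature data is invented for the example:

```python
import statistics

def rolling_zscore_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates from the mean of the trailing
    window by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mean = statistics.fmean(history)
        stdev = statistics.pstdev(history)
        if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# Steady temperatures around 21 °C with one sudden spike (hypothetical data).
temps = [21.0 + 0.1 * (i % 3) for i in range(40)]
temps[30] = 27.5
print(rolling_zscore_anomalies(temps))  # → [30]: the spike is flagged
```

Production-grade agents go far beyond this – learning normal behaviour per asset, distinguishing anomaly patterns from known failure signatures, and attaching prescriptive advice – but the core principle of comparing live data against a learned baseline is the same.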
As the food and beverage industry emerges from the Covid-19 pandemic, the significant trend towards greater use of big data and the implementation of advanced analytics and preventive maintenance will continue. The companies that get ahead fastest and most profitably will be those that choose the right solution – one that brings results quickly and easily from thousands of data points.
Laura Stridiron is Senior Product Manager at AspenTech, a leading software supplier for optimizing asset performance. Its products thrive in complex, industrial environments where it is critical to optimize the asset design, operation and maintenance lifecycle. AspenTech uniquely combines decades of process modelling expertise with machine learning. The company’s purpose-built software platform automates knowledge work and builds sustainable competitive advantage by delivering high returns over the entire asset lifecycle. As a result, companies in capital-intensive industries can maximize uptime and push the limits of performance, running their assets safer, greener, longer, faster.