Big data, analytics, the Internet of Things… These are terms we’re all hearing a lot these days, and they signal big changes on the horizon, especially for those of us involved in the ethanol industry. And with all the talk about how the Internet of Things will power the next industrial revolution, or how big data will change manufacturing, some of us in the industry may feel that we’re not ready for these changes.

That’s certainly how I felt when I first heard these terms, but once I dug into what they meant I realized that for the most part, they’re nothing new. Take big data for example: It’s just a term to describe large volumes of data gathered on a day-to-day basis, usually by businesses. And analytics? It’s just a fancy way to describe what the business does with the data: Good analytics can help businesses make better decisions and improve how they work. The average ethanol plant gathers large amounts of data across its processes every day, then analyzes the data to troubleshoot and optimize processes. So big data and analytics are things we already do.

Huge volumes of data
To get an idea of just how much data ethanol plants gather and analyze every day, take a look at the ethanol process flow diagram below. And this isn’t even the full picture of the huge volumes of data most plants gather, as this simplified diagram doesn’t show all the process steps, or all the data sets gathered at each step.

Ethanol process flow diagram

Pulling in and making sense of all this data obviously requires some sort of automated system, which is why distributed control systems are as much a standard in ethanol plants as they are in other manufacturing plants and processes. In these systems, the data gathered at every process step shown above is fed into central supervisory equipment. So far, so good. But when you consider the sheer volumes of data continuously flowing into that central equipment – as well as the data from the plant’s lab – you get some idea of how difficult it is to pinpoint and understand the data that matters: the data that will help a plant troubleshoot and optimize effectively.
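To put those volumes into rough numbers, here’s a quick back-of-envelope sketch in Python. The tag count and logging interval are assumptions picked purely for illustration, not figures from any particular plant:

```python
# Back-of-envelope estimate of how many readings a plant's central system collects per day.
# The tag count and logging interval are illustrative assumptions, not figures from a real plant.
TAGS = 2000               # assumed instrument tags (temperatures, flows, pressures, levels)
LOG_INTERVAL_S = 5        # assumed seconds between logged readings per tag
SECONDS_PER_DAY = 24 * 60 * 60

readings_per_day = TAGS * SECONDS_PER_DAY // LOG_INTERVAL_S
print(f"~{readings_per_day:,} readings per day")  # ~34,560,000 with these assumptions
```

Even under those modest assumptions, that’s tens of millions of readings every single day, before any lab results are added on top.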

Snapshots rather than the big picture
So because most plants just don’t have the time or capability to analyze and understand all the data they’re capturing and recording, they do the next best thing. They aim to analyze and understand some of it by taking data from samples, then try to build the most accurate ‘big picture’ possible from those samples. The problem with this approach is that samples give a snapshot of a very short time period, and whether they truly reflect what’s happening continuously is a question of luck. To take some of the guesswork out, plants take as many samples as they can and make decisions based on the average results across the sample data set. All ethanol plants have automated this process to some extent, with computers plotting graphs over time to show trends, but it’s still not an optimal process.
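As a minimal sketch of that sample-and-average approach, here’s what the trending side often boils down to: take a handful of lab samples and smooth them into a rolling average. The sample values and column names below are invented for illustration, not data from any real plant:

```python
import pandas as pd

# Hypothetical lab samples: fermenter ethanol concentration (% w/v) measured every four hours.
samples = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-01-01 00:00", "2024-01-01 04:00", "2024-01-01 08:00",
        "2024-01-01 12:00", "2024-01-01 16:00", "2024-01-01 20:00",
    ]),
    "ethanol_pct": [11.8, 12.1, 12.0, 11.6, 11.9, 12.2],
}).set_index("timestamp")

# A three-sample rolling average turns individual snapshots into a trend line.
samples["trend"] = samples["ethanol_pct"].rolling(window=3).mean()
print(samples)
```

Each row is still just a snapshot of one moment in time; everything that happened between samples never makes it into the trend.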

Troubleshooting and optimization
Take troubleshooting equipment malfunctions as an example: Working off trends means that most plants only find out there’s a problem after the piece of equipment breaks down or – if they’re lucky – when it’s just about to, leading to expensive stoppages and downtime. With a system that allowed plants to ‘see’ and analyze the data in real time as it flowed in from throughout the plant, those kinds of problems could be spotted well in advance and avoided. Another example is optimization through key performance indicators (KPIs). Most plants use some sort of daily scorecard system to measure their KPIs, and I’ve got personal experience of the value of these systems from my time as director of operations at a 100+ million gallon/year plant in the US. Implementing a better system helped my team take an already high-performing plant to the next level. But improving these systems is extremely time- and resource-intensive, which means many plants don’t benefit from them as much as they could.
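To make the scorecard idea concrete, here’s a minimal sketch of a daily KPI calculation. The KPI names, targets and production figures are made up for illustration; they’re not the scorecard my team used:

```python
# Hypothetical daily scorecard: compare yesterday's totals against target KPIs.
daily_totals = {
    "ethanol_gal": 310_000,    # assumed gallons of ethanol produced
    "corn_bu": 106_000,        # assumed bushels of corn ground
    "nat_gas_mmbtu": 8_300,    # assumed natural gas consumed (MMBtu)
}

kpis = {
    "yield_gal_per_bu": daily_totals["ethanol_gal"] / daily_totals["corn_bu"],
    "energy_mmbtu_per_1000_gal": daily_totals["nat_gas_mmbtu"] / (daily_totals["ethanol_gal"] / 1000),
}

targets = {"yield_gal_per_bu": 2.90, "energy_mmbtu_per_1000_gal": 26.0}
higher_is_better = {"yield_gal_per_bu": True, "energy_mmbtu_per_1000_gal": False}

for name, value in kpis.items():
    on_target = value >= targets[name] if higher_is_better[name] else value <= targets[name]
    print(f"{name}: {value:.2f} (target {targets[name]:.2f}) -> {'on target' if on_target else 'investigate'}")
```

The calculation itself is simple; the hard, time-consuming part is getting reliable numbers into it every day and acting on what it tells you.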

Rapid, real-time tracking and analysis
And that’s where all the buzz about big data that we’re hearing today becomes relevant to ethanol plants. Big data may be old news to us and other manufacturing plants and processes, but there’s a revolution happening in the tools available to analyze it in these sectors. Modern real-time plant analysis software identifies and tracks data relevant to your KPIs rapidly, in real time and across much larger data sets. Algorithms transform the huge volumes of data contained in continuous plant/process data feeds into relevant analyses of critical data, and visualize this data in dashboards to allow for immediate understanding. By allowing plants to see and understand the patterns and meanings across the entire data flow in real time, the software supports rapid and well-informed decision-making. Plants can pre-emptively troubleshoot to prevent problems rather than reacting once they occur, and they can recognize the factors that drive their performance and develop continuous process improvements accordingly.
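What this looks like under the hood varies from vendor to vendor, but the core pattern can be sketched very simply: watch a live reading against its own recent history and raise a flag the moment it drifts outside its normal band. The snippet below is a toy illustration of that pattern with invented sensor values, not a description of any particular analytics product:

```python
from collections import deque
from statistics import mean, stdev

def watch(feed, window=30, n_sigma=3.0):
    """Flag readings that drift outside their recent normal band (mean +/- n_sigma * std)."""
    history = deque(maxlen=window)
    for timestamp, value in feed:
        if len(history) >= 10:  # wait until there's a baseline to compare against
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > n_sigma * sigma:
                print(f"{timestamp}: reading {value:.1f} is outside the normal band "
                      f"({mu:.1f} +/- {n_sigma * sigma:.1f})")
        history.append(value)

# Hypothetical live feed: (time, fermenter temperature in deg F), steady until one abnormal spike.
feed = [(f"08:{i:02d}", 90.0 + (0.2 if i % 2 else -0.2)) for i in range(20)] + [("08:20", 96.5)]
watch(feed)
```

Applied across thousands of tags at once, that same simple idea is what lets a plant catch a developing problem hours or days before it shows up on a daily trend chart.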

In the next article of this series, my colleagues Laurie Duval and Rachel Burton, who are also on Novozymes’ North American biofuels technical service team, will give concrete examples of how advanced analytics can help plants increase yields and optimize enzyme dosing. And in the final article, we’ll offer insights into the potential of the Industrial Internet of Things to improve manufacturing efficiency and facilitate better, faster decision-making.

Frank Moore

After spending 35 years in the processing and business development segments of the ethanol industry, I now work as a Novozymes Plant Consultant, doing my small part to help our customers reach their next level of efficiency.
