Intelligent process control in bell annealer furnace facilities
Digitalization, Innovation & Technology
EBNER Industrieofenbau | Richard Speletz, Model Developer
From physics-based models to digital twins.
Precision heat treatment is one of the critical factors determining the quality of modern steels. Bell annealers play a central role here, particularly when smaller charges or special alloys need to be processed flexibly and energy-efficiently.
We recently interviewed Richard Speletz, Model Developer, who explained how physics-based models, digital twins and AI algorithms will revolutionize control systems and optimize bell annealer furnaces – providing everything from intelligent responses to process faults to precision temperature control.
Could you describe the process employed at a bell annealer, along with its importance to the steel industry?
Bell annealers are used to heat treat steel coils and steel wire. Precisely defined thermal practices enable specific mechanical properties, including the hardness, strength and ductility of the steel, to be achieved.
In comparison to continuous heat treatment facilities, bell annealers are significantly more energy efficient and significantly more flexible – particularly for small charges. They are well suited to a wide variety of steel grades, coil sizes and production requirements, as they not only support a wide variety of processes but also allow temperatures to be precisely controlled.
Before modeling was introduced, what was the greatest challenge for process control?
One of the greatest challenges was that the temperature in the interior of a steel strip coil could not be measured: inside the furnace itself, only gas temperatures could be measured. However, because of the thermal inertia between the gas and the material, there could be significant temperature differences between the two.
Without accurate knowledge of the material temperature, accurate process planning is extremely difficult. In turn, that leads to longer downtimes, increased energy consumption and increased amounts of scrap – particularly when sensitive alloys are involved.
How did you approach the creation of a physics-based digital twin?
The first step was to develop a comprehensive mathematical model that incorporated subsystems like the heating bell, inner cover, burners and steel coils. Input data included burner output and blower speeds.
In practice, customers specify recipes – so the model was extended to incorporate internal control circuits. This created a digital twin of the facility, which can process recipes automatically and accurately. In this way selected material properties can be targeted by controlling the heating cycle.
Which differential equations and physical principles are used?
The system is based on the nonhomogeneous two-dimensional heat equation, which is derived from the principle of conservation of energy and Fourier’s law.
The model accounts for the conduction of heat within the steel coil, as well as the transfer of heat within the system through both convection and radiation. This provides a physically correct representation of thermal behavior throughout the entire process.
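Written out, the equation described above takes a standard form. The coordinates and source terms shown here are illustrative assumptions, since the interview does not give the exact formulation of the EBNER model:

```latex
% Nonhomogeneous two-dimensional heat equation, here in
% cylindrical coordinates (r, z), a natural choice for a coil:
\frac{\partial T}{\partial t}
  = \alpha\left(\frac{1}{r}\frac{\partial}{\partial r}
      \left(r\,\frac{\partial T}{\partial r}\right)
      + \frac{\partial^{2} T}{\partial z^{2}}\right)
  + \frac{\dot{q}}{\rho c_p},
\qquad \alpha = \frac{k}{\rho c_p}

% Fourier's law, which supplies the conductive flux:
\mathbf{q} = -k\,\nabla T

% Boundary condition at the coil surface, combining convective and
% radiative exchange with the furnace atmosphere at temperature T_g:
-k\,\frac{\partial T}{\partial n}
  = h\,(T_s - T_g) + \varepsilon\sigma\,(T_s^{4} - T_g^{4})
```

Here α is the thermal diffusivity, q̇ a volumetric heat source, h the convective heat-transfer coefficient, ε the surface emissivity and σ the Stefan–Boltzmann constant.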
How does the model contribute to better planning and to reducing scrap?
Thanks to the model, the temperature of any point within the steel coil can be calculated – even for points at which it is not possible to take a measurement. This allows the temperature profile to be accurately predicted, optimizing the scheduling of future annealing cycles.
Results include increased throughput, reduced energy consumption and a significant reduction in the amount of scrap.
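The calculation described above can be sketched with a one-dimensional explicit finite-difference scheme. This is a deliberately simplified illustration, not EBNER's model: it treats the coil as a slab in the radial direction only, ignores radiation, and uses made-up material values.

```python
def simulate_coil(gas_temp, t_end, T0=20.0, n=20, dt=0.5,
                  alpha=1e-5, k=30.0, h=50.0, dr=0.01):
    """Return the radial temperature profile (bore -> surface) after
    t_end seconds of heating by the furnace atmosphere.

    gas_temp : callable t -> atmosphere temperature [degC]
    alpha    : thermal diffusivity [m^2/s] (illustrative value)
    k, h     : conductivity [W/mK], convection coefficient [W/m^2K]
    """
    T = [T0] * n
    for s in range(int(t_end / dt)):
        Tg = gas_temp(s * dt)
        Tn = T[:]
        # interior nodes: plain 1-D heat conduction
        for i in range(1, n - 1):
            Tn[i] = T[i] + alpha * dt / dr**2 * (T[i+1] - 2*T[i] + T[i-1])
        # coil bore: symmetry (zero-flux) boundary
        Tn[0] = Tn[1]
        # outer surface: conduction inward plus convection from the gas
        flux = k * (T[n-2] - T[n-1]) / dr + h * (Tg - T[n-1])
        Tn[n-1] = T[n-1] + dt * alpha / (k * dr) * flux
        T = Tn
    return T

# After one hour at a 700 degC atmosphere the surface leads the bore,
# exactly the gas/material lag the interview describes:
profile = simulate_coil(lambda t: 700.0, 3600.0)
```

Given only the measured gas temperature, such a scheme yields an estimate at every interior node, which is the capability the interview attributes to the full model.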
What kinds of sensor data have been integrated into the model?
A wide variety of real-time data is fed into the system, including temperature measurements from the heating bell and process atmosphere, fan and blower speeds, gas (H2, N2) flowrates, time stamps and other signals relevant to the process. These data form the basis for dynamically adapting the process parameters.
How does the system react to real-time faults, for example when a burner fails?
The system detects deviations between the setpoint temperature and the current temperature of the atmosphere. Based on both current and past sensor data, the system calculates adjustments to the thermal practice in real time. The new annealing times are then automatically uploaded to the Process Control System.
This allows the product quality to remain consistently high, even if there are unexpected faults.
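As a minimal illustration of this idea (a heuristic sketch, not EBNER's actual control logic), the accumulated temperature deficit can be converted into an extension of the annealing time:

```python
def adjusted_hold_time(planned_s, setpoint, samples):
    """Extend the planned hold time to compensate for a temperature
    deficit, e.g. after a burner failure.

    planned_s : originally scheduled hold time [s]
    setpoint  : target atmosphere temperature [degC]
    samples   : list of (dt_seconds, measured_temp) pairs
    """
    # accumulated deficit in degree-seconds, counting only shortfalls
    deficit = sum(dt * max(0.0, setpoint - T) for dt, T in samples)
    # convert back into seconds at the setpoint temperature
    return planned_s + deficit / setpoint

# No deviation: the schedule is unchanged.
print(adjusted_hold_time(3600.0, 700.0, [(60.0, 700.0)] * 10))  # 3600.0
# Ten minutes at 70 degC below setpoint: hold one minute longer.
print(adjusted_hold_time(3600.0, 700.0, [(60.0, 630.0)] * 10))  # 3660.0
```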
In your system, how does the Bayesian optimization algorithm work?
The physical model contains a number of free parameters, and these can be optimized during the process.
For example, the cooling phase is defined by five parameters. During cooling, the Bayesian algorithm tries to minimize the deviation between simulated and measured values by adjusting these parameters. As soon as deviation has been minimized, the optimized parameters are used in the next calculation performed by the model – providing a more accurate prediction of the time when cooling ends.
As the algorithm learns from every iteration, it improves continually, providing more accurate results with each cycle.
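The calibration loop can be sketched as follows. Two caveats: the five-parameter cooling curve below is invented for illustration, and plain random search stands in for the Bayesian optimizer (a real implementation would fit a surrogate such as a Gaussian process and pick each trial point via an acquisition function) to keep the example dependency-free.

```python
import math
import random

def cooling_model(params, times):
    """Toy cooling curve governed by five parameters (illustrative)."""
    a, b, c, d, e = params
    return [a + b * math.exp(-c * t) + d * math.exp(-e * t) for t in times]

def deviation(params, times, measured):
    """Sum of squared differences between simulation and measurement."""
    return sum((s - m) ** 2
               for s, m in zip(cooling_model(params, times), measured))

def calibrate(times, measured, bounds, n_trials=500, seed=0):
    """Minimize the simulated/measured deviation over the parameter box."""
    rng = random.Random(seed)
    best_p, best_d = None, float("inf")
    for _ in range(n_trials):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        d = deviation(p, times, measured)
        if d < best_d:
            best_p, best_d = p, d
    return best_p, best_d

# Synthetic "measurement" produced by known parameters:
times = [t * 60.0 for t in range(30)]
true_p = [150.0, 400.0, 0.002, 100.0, 0.0005]
measured = cooling_model(true_p, times)
bounds = [(100, 200), (200, 600), (1e-4, 5e-3), (50, 200), (1e-4, 1e-3)]
best_p, best_d = calibrate(times, measured, bounds)
```

Once the deviation has been driven down, `best_p` takes the role the interview describes: it is fed back into the model for the next prediction of when cooling ends.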
What factors were behind your decision to shift toward neural networks, and how do they supplement existing models?
Our models' calculations generate large amounts of data, which makes neural networks efficient to train. They compute solutions extremely quickly, and can thus take over tasks currently performed by the optimization algorithms.
However, they do not replace either the physical model or the digital twin – they supplement them. While the model ensures that physical laws are observed, the neural networks speed up calculation and enable fine adaptive adjustments to be made.
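A minimal illustration of the surrogate idea: train a tiny neural network on data generated by a model function, so that the network can afterwards answer queries far faster than the model itself. Everything here is an assumption for illustration (one input parameter, an invented stand-in "model", a hand-rolled one-hidden-layer network); a production system would use a proper framework and the real model's data.

```python
import math
import random

def train_surrogate(data, hidden=8, lr=0.05, epochs=2000, seed=0):
    """Fit y = f(x) with a 1-hidden-layer tanh network, full-batch
    gradient descent on squared error. `data` is a list of (x, y)
    pairs, e.g. generated by the physics-based model."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    n = len(data)
    for _ in range(epochs):
        gw1 = [0.0] * hidden; gb1 = [0.0] * hidden
        gw2 = [0.0] * hidden; gb2 = 0.0
        for x, t in data:
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            y = sum(w2[j] * h[j] for j in range(hidden)) + b2
            dy = y - t                       # dLoss/dy for squared error
            for j in range(hidden):          # backpropagate through tanh
                gw2[j] += dy * h[j]
                dpre = dy * w2[j] * (1 - h[j] ** 2)
                gw1[j] += dpre * x
                gb1[j] += dpre
            gb2 += dy
        for j in range(hidden):              # averaged gradient-descent step
            w1[j] -= lr * gw1[j] / n
            b1[j] -= lr * gb1[j] / n
            w2[j] -= lr * gw2[j] / n
        b2 -= lr * gb2 / n

    def predict(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return sum(w2[j] * h[j] for j in range(hidden)) + b2
    return predict

# Stand-in "physics model": some process quantity vs. one parameter.
model = lambda x: 2.0 + x * x
data = [(i / 19.0, model(i / 19.0)) for i in range(20)]
predict = train_surrogate(data)
```

The trained `predict` evaluates in microseconds, while the physical model it was trained on keeps the physics honest, which is the division of labor described above.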
What are the next steps in the development of this technology?
The next step in development is to integrate neural networks directly into the existing model, allowing process control to be even more responsive and adaptable.
Our goal is to have both approaches, Bayesian optimization and neural networks, ready for the market. Above and beyond this, we would like to extend the approach to cover other uncertainties in the process – for example, to identify atmosphere flow characteristics when a stack has a complex shape.
In the future, what role will artificial intelligence play in optimizing thermal processes?
Artificial intelligence will play a key role, particularly in:
- Adaptive process control, reacting automatically to malfunctions,
- Predictive maintenance, with predictions created by recognizing patterns in sensor data, and
- Hybrid systems, which combine physical models with AI-supported parameter optimization.
Physics-based models will continue to form the foundation, but AI will make them faster and more reliable, and will expand the scope of their applications.
What kind of feedback have you received from customers that are already using these models?
Feedback has been extremely positive. Customers report a significant improvement in their ability to plan, shorter processing times and a significant reduction in the amount of scrap.
In addition to all this, modeling offers improved flexibility when testing new annealing cycles – without affecting process safety. It clearly brings added value to daily operations.


