In many sectors of the economy, the age of Industry 4.0 and the Internet of Things (IoT) has not only begun but is already well underway. That said, there is still a long road ahead of us. The future of automation is digital and offers many ways to cut engineering effort. But what does that mean, and how do we get there?
To improve product life-cycle management (PLM) processes, we are pursuing three interconnected approaches. The first is a digital twin of the product. This means capturing 100% of product data (including commercial and logistical data) in the digital twin – with design checks occurring continuously in real time.
The second approach is a digital twin of the production. Here, the idea is to apply the digital twin not simply to individual products but to the entire factory set-up – including production cells and lines.
With the help of tools such as Process Simulate, Plant Simulation or PLC SIM Advanced, it will be possible to optimize all value streams from the beginning.
Finally, the vast amounts of data that are generated in manufacturing processes will be analyzed by artificial intelligence (AI) and – using a closed-loop approach – fed back into product design and plant design. Edge and Cloud Computing will play an important role here.
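The closed-loop idea can be made concrete with a toy example. The sketch below (all parameter names, values and thresholds are purely illustrative, not from any real plant) checks whether measured process data has drifted away from the nominal design values and, if so, produces a correction offset that could be fed back into product or process design:

```python
import statistics

def detect_drift(measurements, nominal, tolerance):
    """Flag parameters whose mean measurement drifts outside tolerance.

    Returns a dict of suggested correction offsets (nominal - observed mean)
    for each out-of-tolerance parameter - the "feedback" of the closed loop.
    """
    feedback = {}
    for name, values in measurements.items():
        mean = statistics.mean(values)
        if abs(mean - nominal[name]) > tolerance[name]:
            # Feed the observed offset back as a design/process correction hint
            feedback[name] = nominal[name] - mean
    return feedback

# Illustrative sensor data: the bore diameter drifts, the wall thickness is on target
measurements = {"bore_mm": [10.08, 10.11, 10.09], "wall_mm": [2.01, 1.99, 2.00]}
nominal = {"bore_mm": 10.0, "wall_mm": 2.0}
tolerance = {"bore_mm": 0.05, "wall_mm": 0.05}
print(detect_drift(measurements, nominal, tolerance))
```

In a real deployment this check would run close to the machine (edge) on streaming data, with aggregated results consolidated in the cloud – which is exactly why edge and cloud computing matter here.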
The digital twin of performance addresses two main questions:
1) How well is the machine/factory performing? How much downtime/uptime have we had over a certain period?
The second aspect is harder to measure:
2) How good is the quality of the product? To what extent are quality issues related to production processes as opposed to the quality of the raw materials?
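The first question – uptime and downtime over a period – boils down to a simple KPI calculation. This minimal sketch (the data and time window are made up for illustration) computes availability as the share of a period during which the machine was not down:

```python
from datetime import datetime, timedelta

def availability(downtime_intervals, period_start, period_end):
    """Availability = uptime / total period length.

    downtime_intervals: list of (start, end) datetime pairs during which
    the machine was down; intervals are clipped to the reporting period.
    """
    total = (period_end - period_start).total_seconds()
    downtime = sum(
        (min(end, period_end) - max(start, period_start)).total_seconds()
        for start, end in downtime_intervals
        if end > period_start and start < period_end
    )
    return (total - downtime) / total

# Example: a single 2-hour outage in a 24-hour reporting window
day_start = datetime(2023, 5, 1)
day_end = day_start + timedelta(hours=24)
outages = [(day_start + timedelta(hours=3), day_start + timedelta(hours=5))]
print(round(availability(outages, day_start, day_end), 3))  # 0.917
```

The second question resists this kind of one-liner precisely because quality outcomes depend on many correlated factors (process parameters, raw materials, environment) rather than on a single logged signal.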
The ultimate digital goal is to continually connect and automate PLM tasks (through workflows) and thereby significantly reduce time to market, time of change (in the case of redesigns), and planning and development time. You can find some nice examples from the machine-building industry here.
AI helps engineers
As even the short introduction to this blog post implies, there are several different, sometimes parallel paths to making the most of today's and tomorrow's technologies. Autonomous systems represent one aspect, but they aren't the whole story. In the future, AI in the form of proactive engineering assistants will help reduce engineering effort. These systems will actively learn from available data and experience, building on routine knowledge and user interaction.
Self-optimizing machines promise huge advantages for plant operators. While a system is in operation, optimizing, retrofitting or upgrading automation solutions always requires further expert input.
This situation changes, however, once machines learn how to interpret their environment and – within defined limits – start to make their own decisions. Autonomous functions in both engineering and plant operation make the detailed work of engineers and operators easier, and enable the plant to perform intelligently, both during operation and during phases of optimization.
Artificial intelligence in automation can take productivity to the next level. Thanks to continuous learning while in operation, systems will keep getting better. It will become possible to automate tasks that could not be automated before. That means gains in productivity that were previously unthinkable.
All the various aspects of artificial intelligence will reduce the time and effort spent on programming and development – and plants can be made a lot more flexible and modular.
Where do we stand today?
The scenario described above is a long-term goal that will be reached in gradual steps, and we are still at the beginning of this process: product data is being collected, and manufacturing plants are also being reflected in the digital twin – both processes are ongoing. In terms of performance, the correlation back to product and plant will eventually be automated; currently, this feedback is still performed by people. But we're getting closer. In our factory in Amberg, we have implemented two great examples of a closed-loop approach. If you're interested in how we did it, read the following blog posts:
The broad issue we face today is how to increase data consistency across all three areas – part of the problem is the low level of interoperability of the relevant tools in historically grown IT/OT infrastructures. Reference processes like NPI (New Product Introduction) are still only partly covered by digital workflows, and PLM sub-processes are still being triggered manually. In my next blog post, I will return to our factory in Amberg, where we have started to streamline PLM processes – something we see as an important prerequisite for a digital enterprise. Stay tuned!