
In Siemens Digital Industries (DI), as in many other companies, it is not always easy to harvest the full potential of digitalization because of installed proprietary software solutions, customized processes, non-standardized interfaces and mixed technologies. But for us, this doesn’t mean that we must run a tremendous standardization program before we can use the possibilities of data analytics or predictive maintenance in our plants.

(Never heard of ‘Lean Digital Factory’? Read my introduction first)

To get rubber on the road at scale, we need an architectural concept that allows us to easily develop applications, scale up and transfer solutions from plant to plant, from engineering to the shop floor and from supplier to customer, and to re-use process insights identified in one application in another.

During our LDF program, we used the “North Star” concept in combination with so-called “Reference Processes” to describe exactly what we are aiming for, which functionalities we will need, and how this will benefit us in return.

The “North Star” concept is well known to lean-oriented people and helps to focus all implementation activities. The future process is described, or better, designed, and then compared with the actual situation to define the next step, or, as we call it, the evaluation step. This can also be seen as a roadmap with different maturity levels.

Linking the digital twins of product, production and performance, for example, will increase the efficiency of our product lifecycle management process and reduce the time to market of our products. This is only possible if the product is entirely described by its digital twin, including commercial and logistics data, supported in real time by so-called design rule checks, for example during mechanical design in the NX software. While the engineers are designing, a cost model is calculated in real time from a design-cost relation pattern, making the cost consequence of each design decision immediately visible. This is mainly enabled by the high maturity and interoperability of DI software solutions based on the PLM data backbone Teamcenter.
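
To illustrate the principle, here is a minimal sketch of a design rule check with an attached cost model. Everything in it (the rule thresholds, the material prices, the `DesignFeature` fields) is a hypothetical stand-in for illustration; the actual checks run inside NX against the Teamcenter backbone and are far richer.

```python
# Illustrative sketch only: a simplified design rule check with a cost model.
# All rule names, parameters and cost figures below are hypothetical.
from dataclasses import dataclass

@dataclass
class DesignFeature:
    name: str
    wall_thickness_mm: float
    material: str

# Hypothetical design-cost relation pattern: assumed material prices,
# plus a process surcharge for thin walls that need tighter tolerances.
MATERIAL_COST_PER_KG = {"ABS": 2.1, "PA66": 3.8}  # assumed example values

def check_and_cost(feature: DesignFeature, mass_kg: float):
    violations = []
    if feature.wall_thickness_mm < 0.8:
        violations.append(f"{feature.name}: wall below 0.8 mm manufacturing limit")
    base_cost = MATERIAL_COST_PER_KG.get(feature.material, 5.0) * mass_kg
    process_factor = 1.5 if feature.wall_thickness_mm < 1.2 else 1.0
    return violations, base_cost * process_factor

violations, cost = check_and_cost(DesignFeature("housing", 1.0, "ABS"), mass_kg=0.25)
print(violations, f"estimated cost: {cost:.2f} EUR")
```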

Another “North Star” is the full digital visualization of the complete plant, with its production lines and cells, including the logistics processes for material and tools, using “Process Simulate” and “Plant Simulation” to design and optimize all assets on the shop floor right the first time.

Finally, all data generated during the production process is collected, analyzed with artificial intelligence methods and fed back into the product and production design process for continuous optimization.
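
As a hedged illustration of such a feedback loop, the sketch below flags when the rolling mean of a measured process parameter drifts away from its nominal value, so the deviation can be routed back to design. The window size, threshold and sample stream are assumptions, not our production settings.

```python
# Minimal sketch of a closed feedback loop over a stream of
# (nominal, measured) process values; parameters are illustrative.
from collections import deque

def detect_drift(stream, window=50, threshold=0.05):
    """Yield an alert when the rolling mean drifts from the nominal value."""
    recent = deque(maxlen=window)
    for nominal, measured in stream:
        recent.append(measured)
        rolling_mean = sum(recent) / len(recent)
        if len(recent) == window and abs(rolling_mean - nominal) / nominal > threshold:
            yield {"nominal": nominal, "observed": round(rolling_mean, 3),
                   "action": "review tolerance in product/production design"}

# Synthetic stream with a slow upward drift, for demonstration only.
for alert in detect_drift((10.0, 10.0 + 0.02 * i) for i in range(100)):
    print(alert)
    break
```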

This means that the processes for new product and new machine introduction (NPI & NMI) will be largely automated, cost-optimized and shortened in time by 66%.

Another example is the full automation of the planning, scheduling and sequencing of our production orders to ensure the best utilization of our assets during production, taking all internal and external operational influences into account. At the same time, we want to come as close as possible to a one-piece flow and a cycle time equal to the customer’s cycle time. In this context, our MDP (Manufacturing Data Platform) is used, for example, for capacity balancing. In the DI factory network we have many electronics plants with corresponding surface-mount technology (SMT) production lines. To use this synergy, we are creating a marketplace where we can balance capacity across factory boundaries.
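
A deliberately simplified sketch of the idea behind such a marketplace follows; the greedy matching rule, the plant names and the hour figures are all invented for illustration and are not the actual MDP logic.

```python
# Hedged illustration of cross-factory capacity balancing: plants publish
# free SMT line hours, and open orders are matched to the plant with the
# most spare capacity, largest orders first.
def balance_capacity(orders, free_hours):
    """orders: list of (order_id, required_hours);
    free_hours: dict plant -> available SMT line hours."""
    assignments = {}
    for order_id, hours in sorted(orders, key=lambda o: -o[1]):
        plant = max(free_hours, key=free_hours.get)  # most spare capacity first
        if free_hours[plant] >= hours:
            assignments[order_id] = plant
            free_hours[plant] -= hours
        else:
            assignments[order_id] = None  # no plant can take it this period
    return assignments

print(balance_capacity([("A-100", 40), ("A-101", 25)],
                       {"Amberg": 60, "Chengdu": 30}))
```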

On the shop floor, autonomous guided vehicles (AGVs) with swarm intelligence and robot farms form a cyber-physical system that organizes intralogistics material supply and highly flexible work arrangements.
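
One common decentralized pattern behind such systems is an auction: each vehicle bids for a transport task and the cheapest bid wins. The sketch below shows this contract-net style allocation with made-up AGV positions; it illustrates the principle, not our actual fleet controller.

```python
# Sketch of decentralized task allocation in the spirit of swarm
# intelligence; positions are simplified to one dimension and invented.
def auction_transport_task(task_location, agvs):
    """Each AGV bids its distance to the pick-up point; lowest bid wins."""
    bids = {name: abs(pos - task_location) for name, pos in agvs.items()}
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

winner, cost = auction_transport_task(task_location=12,
                                      agvs={"agv-1": 3, "agv-2": 10, "agv-3": 20})
print(f"{winner} wins the transport task at distance {cost}")
```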

Artificial intelligence, machine learning algorithms and pattern recognition will support and enable predictive maintenance, reduce test efforts and increase machine utilization by distributing relevant information to people via connected smart devices. In cooperation with Schmalz, we have already put this concept into practice to increase machine utilization in our factory in Amberg, where we optimized a packaging machine using an intelligent sensor from Schmalz together with Siemens hardware and software.
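
As a minimal example of the kind of model involved, the following sketch trains an anomaly detector on healthy vibration features and flags deviating readings for maintenance. scikit-learn’s IsolationForest and the synthetic sample data are stand-ins, not the setup used in Amberg.

```python
# Minimal predictive-maintenance sketch, assuming vibration features have
# already been extracted from the sensor stream.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
healthy = rng.normal(loc=1.0, scale=0.1, size=(500, 2))   # normal operation
model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

new_readings = np.array([[1.02, 0.98],    # looks normal
                         [2.50, 3.10]])   # made-up worn-bearing signature
flags = model.predict(new_readings)       # 1 = normal, -1 = anomaly
for reading, flag in zip(new_readings, flags):
    if flag == -1:
        print(f"schedule maintenance: anomalous vibration pattern {reading}")
```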

The data for all these scenarios must be extracted from different sources such as the ERP system, the Siemens PLM system Teamcenter, the MES level, the SCADA level, “Industrial Edge” (shop floor data), Simatic PLCs, sensors and other sources not yet known.
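
A sketch of what normalizing such heterogeneous sources can look like is shown below; the record shape, field names and the two example payloads are invented for illustration.

```python
# Sketch of normalizing heterogeneous shop-floor sources into one record
# shape so downstream analytics sees uniform data points.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DataPoint:
    source: str        # e.g. "ERP", "MES", "PLC"
    asset: str
    signal: str
    value: float
    timestamp: datetime

def from_plc(tag: str, raw: int, scale: float) -> DataPoint:
    # PLCs often deliver scaled integers without timestamps; stamp on ingest.
    return DataPoint("PLC", asset=tag.split(".")[0], signal=tag,
                     value=raw * scale, timestamp=datetime.now(timezone.utc))

def from_mes(record: dict) -> DataPoint:
    # MES records typically carry their own event time.
    return DataPoint("MES", record["machine"], record["parameter"],
                     float(record["value"]),
                     datetime.fromisoformat(record["event_time"]))

points = [from_plc("Line1.Temp", raw=2315, scale=0.01),
          from_mes({"machine": "Line1", "parameter": "cycle_time",
                    "value": "12.4", "event_time": "2019-05-06T08:15:00+00:00"})]
print(points)
```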

Knowing all the different scenarios and target states we are aiming for, it becomes all the more obvious that a traditional data warehouse architecture reaches its limits. The 9 Vs of big data (volume, velocity, variety, veracity, value, volatility, visibility, viability and validity) now define data, and the target is that all kinds of data are treated as analyzable data, not just structured corporate data.

Data volume has exploded in the factories due to traceability requirements, legal regulations or simply quality assurance. Streaming data is becoming more and more important, and therefore velocity is essential.

Today, we want to analyze data from many different sources; the variety spans video feeds, photographs, process data and test data, through log files and even plain text files. With this comes the challenge of which data to trust, which to keep and which to discard. Does it all need to have the same values and units, and how long is the life cycle of this data? Machines produce far more valuable data than what is usually collected in a classic data architecture.

Read my next blog, Lean Digital Factory 3: The Manufacturing Data Platform, to find out more about the new manufacturing data architecture we are implementing in our network of factories.
