5 June 2019

The North Star and its Data (2/4)

Within Siemens, as in many other companies, it is not always easy to harness the full potential of digitalization due to installed proprietary software solutions, customized processes, non-standardized interfaces and mixed technologies. But for us, this doesn’t mean that we must run a tremendous standardization program before we can use the possibilities of data analytics or predictive maintenance in our factories.

(Read my introduction first)

To put rubber on the road at scale, we need an architectural concept which allows us to easily develop applications, scale up and transfer solutions from plant to plant, from engineering to shop floor as well as from supplier to customer, and to re-use process insights identified in one application in another.

We used the “North Star” concept in combination with so-called “Reference Processes” to describe exactly what we are aiming for, which functionalities we will need, and how this benefits us in return.

The “North Star” concept is well known to lean-oriented people and helps to focus all implementation activities: the future process is described, or better, designed, and compared with the actual situation to define the next step, or as we call it, the evaluation step. This can also be seen as a road map with different maturity levels.

The linkage of the digital twins of product, production and performance, for example, will increase the efficiency of our product lifecycle management process and reduce the time to market of our products. This will only be possible if the product is entirely described by its digital twin, including commercial and logistics data, supported in real time by so-called design rule checks, for example during mechanical design in NX software. While the engineers are designing, a cost model is calculated in real time using a design-cost relation pattern: the cost consequence of a design decision. This is mainly supported by the high maturity and interoperability of our own software solutions based on the PLM data backbone.
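As a rough sketch of the idea (not our actual NX or Teamcenter model; the rules and cost coefficients below are invented purely for illustration), a design rule check paired with a design-cost relation could look like this:

```python
# Hypothetical sketch: a design rule check that estimates cost while designing.
# Rule thresholds and cost coefficients are illustrative assumptions, not a
# real PLM cost model.

from dataclasses import dataclass

@dataclass
class PartDesign:
    material: str          # e.g. "aluminium" or "steel"
    volume_cm3: float      # material volume of the part
    hole_count: int        # number of drilled holes
    min_wall_mm: float     # thinnest wall in the design

# Illustrative cost coefficients (currency units per cm3 / per operation).
MATERIAL_COST_PER_CM3 = {"aluminium": 0.12, "steel": 0.05}
COST_PER_HOLE = 0.30  # drilling operation cost

def design_rule_check(part: PartDesign) -> list[str]:
    """Return a list of rule violations; an empty list means the design passes."""
    violations = []
    if part.min_wall_mm < 1.0:
        violations.append("wall thickness below 1.0 mm: not castable")
    if part.material not in MATERIAL_COST_PER_CM3:
        violations.append(f"unknown material: {part.material}")
    return violations

def estimate_cost(part: PartDesign) -> float:
    """Design-cost relation: every design decision maps to a cost term."""
    material = MATERIAL_COST_PER_CM3[part.material] * part.volume_cm3
    machining = COST_PER_HOLE * part.hole_count
    return round(material + machining, 2)

part = PartDesign(material="aluminium", volume_cm3=250.0, hole_count=8, min_wall_mm=1.5)
print(design_rule_check(part))   # []
print(estimate_cost(part))       # 30.0 material + 2.4 drilling = 32.4
```

The point is the feedback loop: every change the engineer makes re-triggers the check and the cost estimate, so the cost consequence of a design decision is visible immediately.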

Another “North Star” is the full digital visualization of the complete factory with its production lines and cells, including its logistics processes for material and tools, via “Process Simulate” and “Plant Simulation”, to design and optimize all the assets on the shop floor right the first time.

Finally, all data generated during the production process are collected, analyzed with artificial intelligence methods and fed back into the product and production design process for continuous optimization.

This means that the processes for new product and new machine introduction (NPI & NMI) will be largely automated, cost-optimized and shortened. I believe a time reduction of 66% is possible.

Another example is the full automation of the planning, scheduling and sequencing of our production orders to assure the best utilization of our assets during production, taking all internal and external operational influences into account. In parallel, we want to come as close as possible to a one-piece flow and a cycle time that matches the customer’s cycle time. In this context, our manufacturing data platform is used, for example, for capacity balancing. In our factory network we have many electronics plants with corresponding surface-mount technology production lines. To use this synergy, we are creating a marketplace where we can balance capacity across factory boundaries.
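A minimal sketch of such a capacity marketplace, with invented plant names and line-hour figures (not our real planning logic), could look like this: plants publish their free capacity, and overloaded plants offload surplus orders greedily.

```python
# Illustrative sketch of cross-factory capacity balancing: plants publish free
# SMT line capacity to a shared "marketplace", and overloaded plants offload
# surplus load. All names and numbers below are made up for the example.

def balance_capacity(load: dict[str, int], capacity: dict[str, int]) -> list[tuple[str, str, int]]:
    """Move surplus load (in line-hours) from overloaded to underloaded plants.

    Returns a list of transfers (from_plant, to_plant, hours).
    """
    surplus = {p: load[p] - capacity[p] for p in load if load[p] > capacity[p]}
    slack = {p: capacity[p] - load[p] for p in load if load[p] < capacity[p]}
    transfers = []
    for src, need in surplus.items():
        for dst in list(slack):
            if need == 0:
                break
            moved = min(need, slack[dst])
            transfers.append((src, dst, moved))
            need -= moved
            slack[dst] -= moved
            if slack[dst] == 0:
                del slack[dst]  # this plant's free capacity is used up
    return transfers

load     = {"Amberg": 120, "Chengdu": 60, "Fuerth": 90}
capacity = {"Amberg": 100, "Chengdu": 90, "Fuerth": 100}
print(balance_capacity(load, capacity))  # [('Amberg', 'Chengdu', 20)]
```

A real marketplace would of course also weigh transport cost, qualification of the lines and delivery dates; the greedy assignment here only shows the basic mechanism.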

On the shop floor, autonomous guided vehicles (AGVs) with swarm intelligence and robot farms form a cyber-physical system that organizes intralogistics material supply and highly flexible work arrangements.

Artificial intelligence, machine learning algorithms and pattern recognition will support and enable predictive maintenance, reduce test efforts and increase machine utilization by distributing relevant information to people via connected smart devices. In cooperation with Schmalz we have already put this concept into practice to increase machine utilization in our factory in Amberg, where we optimized a packaging machine using an intelligent sensor from Schmalz together with Siemens hardware and software.
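To illustrate the principle only (the sensor values and thresholds below are invented, not the actual Schmalz/Siemens implementation), a simple predictive-maintenance check can learn a normal band from healthy readings and flag any drift outside it:

```python
# Minimal sketch of the predictive-maintenance idea: flag a machine for
# inspection when a sensor reading drifts outside its learned normal band.
# Data and thresholds are invented; a real deployment would learn its model
# from historical shop-floor data.

from statistics import mean, stdev

def fit_normal_band(history: list[float], k: float = 3.0) -> tuple[float, float]:
    """Learn an acceptance band (mean +/- k*sigma) from healthy-state readings."""
    mu, sigma = mean(history), stdev(history)
    return mu - k * sigma, mu + k * sigma

def needs_maintenance(reading: float, band: tuple[float, float]) -> bool:
    low, high = band
    return not (low <= reading <= high)

# Healthy vacuum-pressure readings (arbitrary units) from a packaging machine.
healthy = [80.1, 79.8, 80.3, 80.0, 79.9, 80.2, 80.1, 79.7]
band = fit_normal_band(healthy)

print(needs_maintenance(80.0, band))  # False: within the normal band
print(needs_maintenance(72.5, band))  # True: pressure drop, e.g. a developing leak
```

Pattern recognition on real machines uses far richer models, but the loop is the same: learn the healthy state, watch the live stream, and alert people on their smart devices before the machine fails.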

The data for all these scenarios must be extracted from different sources such as the ERP system, Teamcenter, the MES level, the SCADA level, Industrial Edge (shop floor data), SIMATIC PLCs, sensors and other sources not yet known.
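What this extraction implies in practice: each source speaks its own schema, so records have to be normalized into a common format before they can be analyzed together. The field names below are illustrative assumptions, not the real ERP, MES or PLC interfaces:

```python
# Hedged sketch: mapping records from heterogeneous sources onto one shared
# event schema (source, asset, ts, value). The raw field names are invented
# stand-ins for whatever each real system exposes.

def normalize(source: str, raw: dict) -> dict:
    """Map a source-specific record onto a shared event schema."""
    if source == "erp":
        return {"source": "erp", "asset": raw["plant"], "ts": raw["date"], "value": raw["quantity"]}
    if source == "mes":
        return {"source": "mes", "asset": raw["line_id"], "ts": raw["timestamp"], "value": raw["yield"]}
    if source == "plc":
        return {"source": "plc", "asset": raw["tag"], "ts": raw["t"], "value": raw["v"]}
    raise ValueError(f"unknown source: {source}")

events = [
    normalize("erp", {"plant": "1000", "date": "2019-06-05", "quantity": 480}),
    normalize("plc", {"tag": "DB1.TEMP", "t": "2019-06-05T10:15:00", "v": 63.2}),
]
print(events[1]["value"])  # 63.2
```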

Knowing all the different scenarios and target states we are aiming for, it becomes all the more obvious that the traditional data warehouse architecture reaches its limits. The 9 Vs of big data (volume, velocity, variety, veracity, value, volatility, visibility, viability and validity) now define data, and the target is that all kinds of data are analyzable, not just structured corporate data.

Data volume has exploded in the factories due to traceability requirements, legal regulations or simply quality assurance. Streaming data is becoming more and more important, and therefore velocity is essential.
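To illustrate why velocity changes the architecture: a streaming consumer computes its statistics incrementally over a rolling window instead of landing everything in a warehouse first. The window size below is an arbitrary assumption for the example:

```python
# Sketch of stream processing: keep only a rolling window of recent readings
# and update the statistic incrementally on every arriving value.

from collections import deque

class RollingMean:
    """Incrementally maintained mean over the last `size` stream values."""

    def __init__(self, size: int):
        self.window = deque(maxlen=size)
        self.total = 0.0

    def push(self, value: float) -> float:
        if len(self.window) == self.window.maxlen:
            self.total -= self.window[0]   # value about to be evicted
        self.window.append(value)
        self.total += value
        return self.total / len(self.window)

rm = RollingMean(size=3)
for v in [10.0, 20.0, 30.0, 40.0]:
    avg = rm.push(v)
print(avg)  # mean of the last three values: (20 + 30 + 40) / 3 = 30.0
```

Each update costs constant time and memory regardless of how long the stream runs, which is exactly what high-velocity shop-floor data demands.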

Today, we want to analyze data from different sources; the variety of data spans video feeds, photographs, process data, test data, log files and even text files. With this comes the challenge of which data to trust, which to keep and which to discard. Do they all need to have the same value in units, and how long is the life cycle of this data? Machines produce far more valuable data than what is usually collected in a classic data architecture.

Read my next blog The Manufacturing Data Platform to find out more about the new manufacturing data architecture we are implementing in our network of factories.
