Industry 4.0: proven philosophy and new possibilities in structural durability
Networking a company's individual areas and machines so that deviations in the production process can be detected early has, for several years now, been known as part of Industry 4.0. The aim of this networking is to deepen the understanding of the individual process steps so that corrective measures can be taken in good time. Each process step is digitized, and the corresponding digital map, the digital twin, is used to analyze the effects of every individual step along the value chain or across the entire life cycle. But are these ideas and principles really new? Can Industry 4.0 and digitization learn from structural durability? A look at the history of structural durability and of Fraunhofer LBF helps answer this question.
Pioneers of big data analyses
Ever since the founding of Fraunhofer LBF as a laboratory for structural durability, and even before its inclusion in the Fraunhofer Society, Ernst Gaßner published on the importance of accounting for complex, variable operating stresses as a precondition for the construction of lightweight structures. He noted that the key success factor is the correct coordination of material, design, production, and load. From the very beginning, he pointed out that these influences do not act in series but rather influence one another. Recognizing this correlative complexity, Fraunhofer LBF has worked successfully for the last 80 years to identify and quantify the key influencing factors.
Ernst Gaßner’s main idea, namely considering load and load-bearing capacity, or stress and stress resistance, jointly under variable operating loads, led to the development of the “8-stage block test”. This made it possible to reproduce operational stresses on the test bench with the testing machines available at the time, and the method remained the standard structural durability test until the advent of servo-hydraulic testing systems. The “Gaßner test” requires knowledge of the operational loads, a prerequisite that, between 1940 and 1975, drove the development of suitable measuring devices for recording them, such as Svenson’s contact strain gauge, together with the associated classification (counting) methods. These methods determine and compare the variables relevant to structural durability, for example the mean values and the frequencies of amplitudes of different magnitudes in arbitrary load-time functions. The signals are condensed using various mathematical methods and the results are displayed graphically, with the resulting loss of information taken into account. Today one would no longer speak of a “classification method”; it is now referred to as “big-data analysis”, but the procedures and results are the same.
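To make the idea of a classification method concrete, the following minimal Python sketch implements one of the simplest one-parameter counting procedures, range counting: the load-time history is reduced to its turning points, and the resulting half-cycle amplitudes are sorted into classes. The synthetic signal, the choice of eight classes (echoing the 8-stage block test), and all numerical values are illustrative assumptions, not Fraunhofer LBF data or methods.

```python
import numpy as np

def turning_points(signal):
    """Reduce a load-time history to its sequence of local extremes."""
    tp = [signal[0]]
    for x in signal[1:]:
        if x == tp[-1]:
            continue                       # ignore repeated samples
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                     # trend continues: extend the ramp
        else:
            tp.append(x)                   # trend reverses: keep turning point
    return np.asarray(tp)

def range_counting(signal, n_classes=8):
    """Sort half-cycle amplitudes into classes (simple range counting).

    Returns the class upper bounds and the number of half cycles per
    class, i.e. the condensed amplitude-frequency distribution that a
    classification (counting) method extracts from a load history.
    """
    amplitudes = np.abs(np.diff(turning_points(signal))) / 2.0
    edges = np.linspace(0.0, amplitudes.max(), n_classes + 1)
    counts, _ = np.histogram(amplitudes, bins=edges)
    return edges[1:], counts

# Illustrative variable-amplitude history: two superimposed sine loads
t = np.linspace(0.0, 10.0, 5000)
load = 100.0 * np.sin(2 * np.pi * 1.0 * t) + 40.0 * np.sin(2 * np.pi * 6.3 * t)
for bound, count in zip(*range_counting(load, n_classes=8)):
    print(f"amplitude class up to {bound:6.1f}: {count:4d} half cycles")
```

More elaborate counting procedures such as rainflow counting retain additional information, for example mean values, but the principle is the same: a long measured history is condensed into a small amplitude-frequency distribution, accepting a quantifiable loss of information.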
One challenge in experimental structural durability is dealing with what are sometimes very small sample sizes, from which, statistically, neither robust characteristic values nor accurate scatter bands can be derived. In the 1960s, Erwin Haibach addressed this problem by presenting the standardized Wöhler line, built on a large body of preexisting experimental results and experience. Today, this method would likewise be described as the result of a big-data analysis.
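For orientation, the sketch below evaluates a Basquin-type Wöhler (S-N) line with a knee point, the functional form underlying such normalized curves; the slope, knee-point stress, and knee-point cycle number are placeholder values chosen for illustration, not Haibach's published parameters.

```python
import numpy as np

def cycles_to_failure(s_a, s_knee=180.0, n_knee=2.0e6, slope_k=5.0):
    """Basquin-type Woehler (S-N) line with a knee point.

    N(s_a) = n_knee * (s_a / s_knee) ** (-slope_k) above the knee point;
    below it, the amplitude is treated as non-damaging here. All parameter
    values are illustrative placeholders, not measured material data.
    """
    s_a = np.asarray(s_a, dtype=float)
    return np.where(s_a >= s_knee, n_knee * (s_a / s_knee) ** (-slope_k), np.inf)

# Normalizing amplitudes to the knee-point stress lets test series for
# similar details collapse onto one scatter band: the idea behind the
# standardized Woehler line.
for s in (360.0, 270.0, 180.0):
    print(f"s_a = {s:5.1f} MPa -> N = {float(cycles_to_failure(s)):.3g} cycles")
```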
Early digital twins
Numerous research projects have shown that one prerequisite for an accurate lifespan assessment of cyclically stressed components and structures is consideration of the manufacturing process chain, or more precisely of its effects on the cyclic material behavior and thus on the component behavior. To emphasize the importance of the interaction between material, design, production, and load and their influence on structural durability, a department called “Component-related material behavior” was founded at Fraunhofer LBF in 1984; the name remains an integral part of the institute’s organizational chart to this day.
With the emergence of computers, the analytical methods expanded significantly, and towards the end of the 1980s Fraunhofer LBF was able to use numerical manufacturing simulation to examine the influence of different production parameters during rolling. Experimentally verified and validated models thus made it possible to simulate the influence on the lifespan of manufacturing parameters that are often inaccessible to experiment. This laid the foundation for a digital twin, i.e. the numerical representation of a component with all its essential properties. The commercial marketing of the digital twin, however, took place in a different field.
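As a rough illustration of how such a numerical mapping can link manufacturing to lifespan, the sketch below feeds a classified load spectrum and a manufacturing-dependent quantity, here a hypothetical residual stress shift, into a linear Palmgren-Miner damage sum. The additive stress shift and all parameter values are simplifying assumptions for demonstration, not a validated model from the rolling simulations mentioned above.

```python
import numpy as np

def miner_damage(amplitudes, cycles, s_knee=180.0, n_knee=2.0e6,
                 slope_k=5.0, residual_stress=0.0):
    """Linear (Palmgren-Miner) damage sum for a classified load spectrum.

    residual_stress stands in for a manufacturing effect, e.g. taken from
    a rolling simulation; the simple additive shift of the effective
    amplitude is an illustrative assumption, not a validated model.
    """
    s_eff = np.asarray(amplitudes, dtype=float) + residual_stress
    n_allow = np.where(s_eff >= s_knee,
                       n_knee * (s_eff / s_knee) ** (-slope_k), np.inf)
    return float(np.sum(np.asarray(cycles, dtype=float) / n_allow))

# One load spectrum, two virtual manufacturing variants of the component
amps = np.array([60.0, 120.0, 200.0, 280.0])      # class amplitudes in MPa
cyc = np.array([5000, 800, 90, 10])               # cycles per class
for rs in (0.0, 40.0):                            # hypothetical residual stress
    d = miner_damage(amps, cyc, residual_stress=rs)
    print(f"residual stress {rs:5.1f} MPa -> damage sum D = {d:.2e}")
```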