The world of manufacturing is undergoing a transformation, commonly referred to as "Industry 4.0", which confronts many companies with the decision to invest in new technologies such as artificial intelligence (AI), additive manufacturing, and cloud-based platforms. Spending on smart manufacturing technologies is expected to grow by nearly $300 billion by 2023, at an annual rate of around 12%. Yet many companies are struggling to make these investments. This blog post explains why this is the case even for highly successful and innovative companies.
For decades, manufacturers could rely on each new generation of machinery being more efficient than the last, and on new processes and materials to further reduce manufacturing costs.
The advantages and disadvantages of a new technology that aims to directly replace an existing one can be compared and, as far as possible, mapped onto technical key figures. The savings can be quantified in euros. What remains is the technical risk, which can be reduced step by step through proofs of concept (POCs) and small batches; in the end, full-scale series production often follows. Such an innovation process can take several years until the old technology has been completely replaced.
For example, automobile components could be made from engineering plastics instead of metal. Engineers had to rethink design and production, since the different material properties had to be compensated for and completely different machines and processes were used. On the other hand, there were favorable raw material prices and shorter development cycles. The external geometry and the assembly of the parts, however, initially remained very similar, making them classic drop-in substitutes. The expectation was clear: they should perform the same at a lower price. But the full potential of a new technology is not tapped at its introduction. It is tapped only once the company's own specialist departments build up expertise, and the initial skepticism and difficulties slowly turn into the typical joy of optimization. For example, thanks to a more flexible design, the plastic parts could now be clipped instead of screwed, enabling further savings in assembly.
Incremental introduction with the first products, followed by continuous exploitation of the technology's potential, has been virtually the gold standard for manufacturing innovation for many decades, and it will remain so. The new technologies (3D printing, AI, and the cloud), however, can be introduced in this way only to a limited extent and then rolled out step by step. Ironically, trying to proceed as usual creates problems precisely for the companies most experienced in this approach.
While additive manufacturing can in principle be seen as a substitute manufacturing technology, the idea of "batch size 1" is more than a substitute, because it attacks the very foundations of batch production. It represents a shift in how manufacturing itself is conceived. Companies, however, often spend a long time looking for suitable components that can be directly substituted, and the holistic potential of additive manufacturing is tapped only very slowly, if at all.
The situation with cloud systems is different again. Almost all digital startups rely on cloud solutions when they want to distribute software, because software in the cloud requires radically less maintenance and is much easier to scale. It is also easier to connect to other systems than with an on-premise solution. If you sell software yourself, the cloud is therefore exceedingly relevant. But even if you only use software, you have to prepare for the fact that, for these very reasons, there will soon only be cloud solutions on the market, with inevitable consequences for your own IT. Implementing individual use cases in the cloud on a test basis will not help you cope with this general trend, which will end in very high costs when a supplier's on-premise solution reaches its end of life. The full potential of the cloud idea cannot be tapped incrementally. The crucial point is to design software to run "anywhere": in a dedicated cloud, a third-party cloud, or on-premise, depending on current economics, but always as the exact same software. Those who do not completely convert their software landscape will face considerable competitive disadvantages in the future, with no offsetting advantages. Moving partial use cases into the cloud today offers no economic advantage for many industrial applications, and, as with additive manufacturing, there is no natural roll-out: finding individual use cases does not automatically get the ball rolling for further migrations to the cloud.
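The "run anywhere" principle can be made concrete with a small sketch: the application reads its infrastructure endpoints from the environment instead of hard-coding them, so the identical software runs in a dedicated cloud, a third-party cloud, or on-premise. All variable names and defaults below are illustrative assumptions, not a prescription.

```python
import os

def load_config() -> dict:
    """Build the runtime configuration from environment variables.

    The same application image runs unchanged everywhere; only the
    injected environment differs per deployment target.
    """
    return {
        # Database endpoint: injected per environment, never baked
        # into the software itself.
        "database_url": os.environ.get(
            "DATABASE_URL", "postgresql://localhost:5432/app"),
        # Object storage: an S3-compatible endpoint works in public
        # clouds as well as with on-premise stores.
        "storage_endpoint": os.environ.get(
            "STORAGE_ENDPOINT", "http://localhost:9000"),
        # Logs go to stdout by default, so the hosting platform
        # (Kubernetes, systemd, a PaaS) decides where they end up.
        "log_target": os.environ.get("LOG_TARGET", "stdout"),
    }

config = load_config()
```

The design choice matters more than the mechanics: because nothing environment-specific lives in the code, switching between cloud and on-premise becomes a deployment decision driven by current economics, not a software rewrite.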
With AI, no physical parts are substituted, but rather intelligence and support processes. This means it is not the variable manufacturing costs that are optimized but the fixed costs, so AI falls into the same investment category as an ERP or MES system.
If only a partial use case is implemented, e.g., a computer vision application, and this is done on a case-by-case basis, an IT wilderness quickly emerges, and the maintenance and support costs become considerable. Whether you want to monitor weld seams or fruit with AI, it makes sense not only to specify the concrete use case, but also, for example, to treat the object recognition algorithm as an independent asset from the outset and to manage it over its lifecycle. Put bluntly, anyone who relies on isolated AI solutions for their factory is doomed. Only the exploitation of synergy effects leads to long-term profitability, and this must be considered even when introducing AI only partially.
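The idea of managing a recognition model as an asset over its lifecycle can be sketched as follows. The field names, lifecycle states, and the promotion rule are assumptions for illustration, not the schema of any particular MLOps product.

```python
from dataclasses import dataclass, field

@dataclass
class ModelAsset:
    """An object-recognition model treated as a managed asset
    with an explicit lifecycle, rather than a one-off solution."""
    name: str
    version: str
    # One shared model can back several use cases, so maintenance
    # effort is pooled instead of duplicated per isolated solution.
    use_cases: list = field(default_factory=list)
    # Lifecycle: development -> validated -> production -> retired
    status: str = "development"

    def promote(self, new_status: str) -> None:
        """Advance the asset exactly one lifecycle stage at a time."""
        order = ["development", "validated", "production", "retired"]
        if order.index(new_status) != order.index(self.status) + 1:
            raise ValueError(
                f"cannot move from {self.status} to {new_status}")
        self.status = new_status

# A single detector serving two of the use cases from the text.
detector = ModelAsset("object-detector", "1.2.0",
                      use_cases=["weld-seam inspection", "fruit grading"])
detector.promote("validated")
```

The point of the sketch is the synergy argument from the text: the weld-seam and fruit applications share one versioned, lifecycle-managed asset instead of spawning two unmaintained point solutions.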
However, the lack of a direct substitute is not the only thing that complicates decision-making in companies.
On strategy, top management receives advice from consultants and other professional services firms with little empirical experience of shop floor technologies. Operational executives, on the other hand, deal with vendors of specific industry solutions and their recommendations; these, however, are mostly incrementally improved solutions that follow the logic of continuous improvement.
Management, for example, discusses cloud matters with well-known cloud providers and is prepared to do business with new partners, even though those partners bring strong technology but neither special domain expertise nor business-relevant use cases. Operations managers, in contrast, tend to rely on improved, familiar solutions from their trusted suppliers, who in turn can hardly demonstrate credibly that they can offer comparable software solutions.
Between the strategically minded management and the operationally minded plant management, these diverging vendor preferences result in long, political concept phases and delay important investment decisions.
Decision-makers must perform an ROI calculation for investments. If this is not possible, the topic is automatically treated as, at most, an R&D matter and must be financed from that budget.
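For readers outside finance, the ROI calculation gating these decisions is simple in its basic, undiscounted form. The numbers below are purely hypothetical; a real appraisal would discount cash flows (NPV) and price in risk.

```python
def roi(net_benefit_per_year: float, investment: float,
        years: int = 1) -> float:
    """Simple undiscounted return on investment:
    (total benefit - investment) / investment."""
    total_benefit = net_benefit_per_year * years
    return (total_benefit - investment) / investment

# Hypothetical additive-manufacturing cell: 250k euros invested,
# 120k euros in annual savings, evaluated over three years.
print(f"{roi(120_000, 250_000, years=3):.0%}")  # -> 44%
```

The difficulty the text describes is not the arithmetic but the inputs: for AI, cloud, and additive manufacturing, the `net_benefit_per_year` of a partial use case is often unknown or near zero, which is exactly why such projects get pushed into the R&D budget.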
For all three technologies, however, fundamental research and development are no longer necessary: solutions can be purchased and experts hired. The R&D budget is normally used to acquire enough knowledge through preliminary development that a build-or-buy decision can be made in the company's interest. Yet R&D works best when it deals with smaller, well-delimited technology topics.
Only one in five to one in ten of such R&D projects makes it into series production. And that is a good thing; it is part of a sound innovation concept. But does it also apply to these three technologies? Does each of them really have only a 10% chance of reaching series production? If the odds are far greater than 10% that a technology will be indispensable to avoid economic disadvantage in the future, then investing in AI, cloud, and additive manufacturing is not an R&D matter but a business decision. The obvious alternative, introducing these technologies in one fell swoop in entirely new factories, creates yet another problem for particularly successful manufacturing companies.
The idea that a smart factory can be achieved simply by deploying these technologies is too simple. Manufacturing companies are not starting from scratch, nor are they constantly building new factories on greenfield sites. Waiting for such a venture to come along, and only then implementing that one factory in a particularly smart way, means missing opportunities to learn on brownfield sites. Very successful companies in particular are accustomed to introducing new technologies incrementally, so that the necessary empirical experience and know-how have time to develop naturally.
In the manufacturing sector there have been no radical innovations since the 1990s of the kind now on the horizon with AI, cloud, and additive manufacturing, only incremental optimizations. Companies have nevertheless invested billions over the past decades in agile methodologies, enterprise resource planning, and other IT systems such as MES and digital twins, searching for the next big leap to improve process flows, increase visibility, and make on-demand manufacturing feasible and profitable. Yet many manufacturers have failed to take full advantage of these investments, or simply did not solve the fundamental problems that needed addressing. This leads to innovation fatigue, especially among successful companies that have invested particularly heavily in recent years; after all, one expects great success from great effort. Suddenly having to tackle three new revolutions at once, seemingly appeared out of the blue, is discouraging for many, since they are still dealing with the initiatives already under way, and the necessary resources and bright minds remain tied up.