Product Development: The Case for Digital Twins
Product development is core to business success: we covered this relationship in our previous article, “Good design, good business?”. In this article, we make the case for a novel approach to enabling better product development processes within manufacturing organizations – the case for “digital twins”.
Today, digital twins are an operational tool used to represent a product’s properties and status in a live setting. They give companies the ability to detect physical issues sooner, predict outcomes more accurately and potentially build better products. These digital footprints comprise massive, cumulative, real-time, real-world data measurements across an array of dimensions, brought together on modern, scalable technology.
What if we used the same principle to support designers and engineers, so they can make more informed decisions about their products – across aspects like the regulatory environments their products will be sold in, alternative technologies that may offer a cost or environmental advantage, or market requirements that should drive design criteria?
In other words, what if we could use data to represent and model a product’s financial and non-financial performance, allowing teams to quantify and improve business outcomes together, in real-time, right from first designs?
Development strategies today
Modern products are increasingly complex, customer-specific and competitive, while technology advantages are becoming smaller. Companies are scrambling to keep up or get ahead. In this environment, the most significant drivers of product success broadly rely on the quality of execution, the creation of sustainable product advantage and competent cross-functional teams.
For most companies, these goals translate into a strategy around increasing development speed and product success rates, while reducing complexity and business risk from early on. Different companies approach these with different degrees of focus depending on how they are positioned, but they all invariably integrate core elements of each into their strategies.
For example, some companies focus on standardization and modularization programs to simplify manufacturing and design, with the aim of reducing lead times and production costs while increasing quality and efficiency. Others focus on better ingredients/designs and sustainable supply chains and operations, aiming for differentiated product offerings and reduced operational risk.
A common theme to all strategies, however, is the pursuit of better communication about the product as early in the design phase as possible, with the goal of shortening innovation cycles (and therefore time to market) and creating more successful product designs through cross-functional solutions to market requirements. This goes beyond messaging or project management tools and is typically pursued through attempts at centralizing design, manufacturing and market information across the organization.
So, what is the problem?
The balancing act slows things down
Modern product design and management are extremely tedious because of the difficulty of collecting and connecting heterogeneous product data – risks, impacts, costs – to designs. Any change to the design shifts the equilibrium of all other criteria. Teams constantly transfer data such as design (CAD, PLM), materials, environment and health (EHS), risk, compliance, sales (CRM), and procurement between systems and suppliers, trying to find decent compromises across the multitude of design criteria. That constant back and forth slows down new product development and is therefore very often skipped or short-circuited.
Adaptability is not easy, nor cheap
What further complicates matters is that companies today are compelled to adapt to an ever-increasing breadth of market requirements (the “tightening noose”), many of which were not at the forefront just a decade ago. These considerations include stricter regulations, new science about the health effects of different materials, and more.
All this is new and complicated for engineers. For example, circular economy considerations mean that the afterlife of products and their environmental impacts must be thought through from the design stage. This information is not just difficult to come by, but also difficult for non-experts to understand. The solution most companies adopt is to add specialist resources at the end of the design chain – resources that typically end up less integrated than they would need to be to deliver significant value.
Current tools don’t really help
Some companies spend millions on customizing ERP and PLM systems to suit new requirements and on expensive data purchases to fill gaps. For most organizations this is not just too costly: with average implementation times of 2–4 years, it is simply not a here-and-now solution. The tools that exist today are single-purpose, department-specific expert tools that are very often offline in nature – none of which supports the need for collaborative yet secure interactions and cross-functional solutions to modern design challenges.
The result is very often stop-gap solutions to new challenges at best, and deferral of the challenges at worst. Both approaches only delay the inevitable loss of competitive advantage to younger, more agile companies, as we have seen in almost every sector today.
The digital twin
It could be argued that the concept of the digital twin was born out of the need to bring together disparate information in real time, to understand how different environmental and performance characteristics interact with each other. This simplified the task of predicting how products would behave under different conditions. It also filled the gap that sensor systems left – understanding how multiple factors interact to affect performance. Having all that data centralized and time-synchronized removed the complexity of manually connecting the data to find relationships, diverting cheaper and faster computational power to the problem instead. It also solved the issue of dynamically adding more data sources and considerations into the mix. Teams that employ digital twins spend more time trading insights than data.
Why not use the same approach for design criteria? That’s what we do at Makersite. We connect best-in-class data for costs, markets, risks, materials, regulations, environment, health, suppliers and more to create a “digital twin” of a product design or formulation. Built-in tools create reports, apps, and maps that clearly visualize the data that matters.
That way dependencies between product ideas and their business performance become immediately transparent. Once the digital copy has been created, data flows continuously from several sources into a single system. That allows teams to collaborate on the same project and understand the impacts of design changes on parameters outside of their core expertise.
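The core idea – heterogeneous criteria attached to one shared product model, so that a single design change immediately updates every dependent metric – can be sketched in a few lines of code. This is a minimal, illustrative example only; the class and field names are hypothetical and do not reflect Makersite’s actual data model or API.

```python
# Minimal sketch: a shared product model where cost, carbon and compliance
# data hang off the same components, so one design change updates all metrics.
# All names, figures and flags below are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Component:
    name: str
    material: str
    cost_per_unit: float      # e.g. sourced from procurement data
    kg_co2_per_unit: float    # e.g. sourced from environmental data
    restricted: bool          # e.g. flagged by a regulatory watchlist


class ProductTwin:
    def __init__(self, components):
        self.components = {c.name: c for c in components}

    def total_cost(self):
        return sum(c.cost_per_unit for c in self.components.values())

    def total_co2(self):
        return sum(c.kg_co2_per_unit for c in self.components.values())

    def compliance_issues(self):
        return [c.name for c in self.components.values() if c.restricted]

    def swap_material(self, name, material, cost, co2, restricted):
        # One design decision; every metric above reflects it immediately.
        self.components[name] = Component(name, material, cost, co2, restricted)


twin = ProductTwin([
    Component("housing", "ABS", 1.20, 3.5, False),
    Component("gasket", "PVC", 0.30, 0.8, True),  # flagged material
])
print(round(twin.total_cost(), 2), twin.compliance_issues())
twin.swap_material("gasket", "TPE", 0.45, 0.5, False)
print(round(twin.total_cost(), 2), twin.compliance_issues())
```

Here a designer swapping one material sees the cost and compliance picture change at once, without handing data off to another department – the transparency the paragraph above describes.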
Combined with unique security features that enable data exchange with suppliers while protecting everyone’s IP, Makersite’s technology allows teams to analyze how products are made and how to improve performance across the full value chain and life cycle. They get immediate answers about the product, people, planet, and profit – and the connections between them.
For more information about how Makersite works and how it can help with product development, please visit makersite.io
Contact us to get your BOM analyzed.