An awesome new Digital Twin processing feature is now available!
Digital Twin Builder is a new product that builds on previous products, consolidating them into a convenient workflow dedicated to helping engineers design digital twins.
The feature allows you to check operational variables through a user-defined region. User-defined regions are an amazing tool when you want to understand flow between sections of the battlespace, between separate mission components, or even between different sequences of events.
Digital twins not only give us the ability to do things faster, better and cheaper; they are changing the way we look at the entire product lifecycle. No longer are we tied to the iterative design cycle; instead, we can use simulation and other tools to inform our design choices from the very outset.
A Digital Twin, in case you missed it, is a 3D virtual representation of a component, assembly or system that is connected to its real-world counterpart by means of sensors and networked devices. The digital twin behaves just as the real-life twin behaves, and can be used for diagnosis, prognosis, and what-if scenarios of its real-world variant. The data that the digital twin provides can be used for process optimisation and maintenance scheduling, and can even be used to increase profitability by reducing downtime.
A digital twin can be as effective as the user wants it to be, and is limited only by how they choose to programme it.
A Consolidated Workflow
“Twin Builder is built on existing technology and we have invested in reduced order modeling, which is something we’ve been involved with for a few years, so Twin Builder is an extension of work that we have done in the past.
“Reduced order modeling typically takes data from 3D simulation and converts it into a faster simulation. The techniques vary: matrix reduction, equivalent model extraction, regression, polynomial fitting, transfer function/system identification and so on. We also have techniques that can render the field back from the reduced model.
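To make the idea concrete, here is a toy sketch of one of the techniques named above, least-squares fitting of a fast surrogate to data sampled from a slow solver. Everything in it is invented for illustration: the "slow simulation" is a stand-in for an expensive 3D solve, and a straight line is the simplest possible reduced model.

```python
# Toy reduced-order-modeling sketch: replace a slow "3D simulation" with a
# fast surrogate via least-squares fitting. All names and numbers are made up.

def slow_simulation(flow):
    """Stand-in for an expensive 3D solve: pump head vs. flow (made up)."""
    return 50.0 - 2.5 * flow  # a real solve would take minutes or hours

# Sample the expensive model at a handful of design points.
samples = [(x, slow_simulation(x)) for x in range(11)]

# Closed-form least-squares fit of head = a*flow + b (the reduced model).
n = len(samples)
sx = sum(x for x, _ in samples)
sy = sum(y for _, y in samples)
sxx = sum(x * x for x, _ in samples)
sxy = sum(x * y for x, y in samples)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def reduced_model(flow):
    """Evaluates in microseconds; trades generality for speed."""
    return a * flow + b
```

In a real workflow the fit would be higher-order, or a transfer-function/system-identification model, and the samples would come from the 3D solver rather than a formula.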
“One extra thing that we did add to Twin Builder, which has taken a couple of years, was to add connectivity to some popular network platforms. That involved phasing out some old stuff and actually making the connections with the new network platforms, so that’s the new piece.
How to Build a Digital Twin
So that’s how Twin Builder came into existence. But how did engineers build digital twins before these technologies were put together? Was there a proper and accepted method for creating them?
“There wasn’t really a clearly defined method of building a digital twin in the past. What people would do was simply build a virtual prototype. There was no real notion of connecting it to operational data, and no real notion of actually deploying the digital twin at scale in an operational setting.
“So, if you think about it, simulation is typically used at the design phase. You would build one virtual prototype for your design, validate it, and that’s it. But now what we want to do is take that design and replicate it as you would with other virtual assets. That concept of scaling out is what’s really new here.”
“We start by telling customers to look at their top service costs; they can then do some kind of failure mode analysis to determine exactly what kind of digital twin model they need to build.
“We can then look at existing simulation models that were used during the design of the system, subsystem or component identified as key to the failure mode. From these models we can extract, via techniques like reduced order modeling and behavioural modeling, an accurate virtual replica of the physical equipment. Then we validate it by tuning and optimising parameters to accurately match measured data.
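The validation step described above, tuning parameters until the replica matches measured data, might look like the following toy sketch. The model, the measurements and the single tunable parameter are all hypothetical, and the grid search stands in for a proper optimiser.

```python
# Toy calibration sketch: tune one model parameter so the virtual replica
# matches measured data. The model, data and parameter are all hypothetical.

measured = [(0.0, 20.0), (1.0, 24.0), (2.0, 28.0)]  # (hours, deg C), made up

def twin_model(t, heating_rate):
    """Toy twin: linear heating from a known 20 deg C starting point."""
    return 20.0 + heating_rate * t

def calibration_error(rate):
    """Sum of squared differences between the twin and the measurements."""
    return sum((twin_model(t, rate) - y) ** 2 for t, y in measured)

# Grid-search the parameter; a production workflow would use a real optimiser.
best_rate = min((r / 10.0 for r in range(101)), key=calibration_error)
```

Once `calibration_error` is near zero against held-out measurements, the tuned model is the validated twin.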
“Finally, we can export the model as an executable, deployable runtime. We have connectors for several popular network platforms, allowing customers to actually connect the simulation model to data through the platform and deploy the digital twin at scale.”
Use Cases in Industry
So, who are the biggest users of digital twins at the moment?
“At the moment, the digital twins are limited in deployment and limited in scale, but generally the most common uses of digital twins at the moment are for industrial equipment such as motors and pumps, and heating and cooling systems…those types of applications.”
We have heard the word “scalability” used when describing the benefits of digital twins. This can refer to the scalability of the number of instances of digital twins, or it can be used to describe the number of sensors and the scale of the analysis you want to perform.
So that begs the question: How big can a digital twin be in terms of scope and complexity?
“The scale of deployment will vary a lot. For example, in one application, we have a motor, and it has a few sensors on it. The number of sensors is very limited in that case—we are just using them to determine useful life and things like that. Then there are more complicated systems such as submersible pumps that contain multiple subsystems and multiple sensors. These are complicated systems and can contain tens to hundreds of different sensors.”
And those sensors aren’t restricted to physical sensors. Virtual sensors play an important role, too, especially for estimating quantities that go beyond the typical functional data representing nominal physical system operation. In short, a digital twin can be as big or as complicated as you want it to be. But then it really boils down to a question of what information you need. What is useful and actionable?
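A virtual sensor can be sketched as a physics-informed estimate of a quantity that isn’t physically instrumented, computed from quantities that are. The relation, the names and the coefficient below are invented for illustration, not taken from any real product.

```python
# Toy virtual-sensor sketch: estimate bearing temperature (not physically
# instrumented) from measured speed and load. The relation and the
# coefficient k are made up for illustration.

def virtual_bearing_temp(ambient_c, speed_rpm, load_kn, k=0.002):
    """Friction heating scales with speed * load; k comes from calibration."""
    return ambient_c + k * speed_rpm * load_kn

estimate = virtual_bearing_temp(25.0, 3000.0, 10.0)  # roughly 85 deg C
```

The calibration constant `k` would itself be tuned against occasional direct measurements, just like any other twin parameter.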
“We’ve done several trials. Because this is based on existing technology, we have been testing at low scale in a few different industries with some of our advanced customers. Now we have the opportunity to embed simulation solutions for digital twins into manufacturing and asset management portfolios.
This solution will allow customers to study engineering-related effects in advance, through simulation. The solution will be run on cloud platforms, with the goal of enabling those who manage industrial facilities to optimise operation and maintenance through real-time technical insights.
“The result should also, by extension, lead to reduced product cycle times and increased profitability.”
“The merging of physical and digital worlds disruptively affects the way in which products are manufactured, placed on the market and operated. By utilising the insights produced by digital twins, users will be well positioned to exploit the breakthrough this technology brings, and the solution will also help drive innovation.”
These solutions can not only simulate products and facilities during the product development phases, but also in manufacturing and, more importantly, even when the products or facilities are in the hands of end-users.
Based on solutions for digital twins, operators can test which flows in even the most complex structure are most efficient to run under specific conditions. Through this type of simulation, it is possible to iterate and select the mode of operation that delivers the greatest benefit.
These simulations use the enormous amounts of data generated by sensors in the assets, letting engineers gain valuable insights that can improve a process and provide a basis for improving similar processes in the future.
Additionally, one might consider developing hybrid models that leverage machine learning with multi-physical simulation models to accurately predict why a process in a facility may fail after it has been implemented.
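One minimal way to sketch such a hybrid model is to let a physics model supply the nominal prediction while a data-driven correction absorbs what the physics misses. Here the "learned" part is just a mean residual, standing in for a real machine learning model, and all numbers are made up.

```python
# Toy hybrid-model sketch: a physics model gives the nominal prediction and a
# data-driven correction (a mean residual, standing in for a trained ML model)
# absorbs what the physics misses. All numbers are invented.

def physics_pressure(flow):
    """First-principles estimate: pressure drop proportional to flow squared."""
    return 0.8 * flow ** 2

# Operational data the physics model consistently under-predicts.
observed = [(1.0, 1.3), (2.0, 3.7), (3.0, 7.7)]  # (flow, measured pressure)

# "Learn" the correction from the residuals between data and physics.
bias = sum(p - physics_pressure(f) for f, p in observed) / len(observed)

def hybrid_pressure(flow):
    """Physics plus learned correction."""
    return physics_pressure(flow) + bias
```

A production hybrid would replace the constant `bias` with a model trained on the residuals, but the division of labour, physics first and data second, is the same.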
Replacing Scheduled Maintenance with Needs-Related Maintenance
Linking these insights and data into business processes for controlling and managing plant facilities with other relevant platforms is an important step forward in the digital twin strategy.
“Combining the physical and digital worlds can sharpen competitiveness.
From a lifecycle perspective, companies can benefit from real-time insights by tracking how assets are designed, built and operated throughout the lifecycle of the product.
With the new solution, time-based maintenance of industrial assets is replaced with predictive maintenance. The machine or plant components can tell ‘themselves’ when they become worn to the point that they are likely to break.
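A hedged sketch of that needs-based logic: extrapolate a sensed wear indicator and flag maintenance only when failure is projected within a planning horizon. The threshold, rates and horizon are illustrative, not from the article.

```python
# Toy needs-based-maintenance sketch: extrapolate a wear indicator and flag
# service only when failure is projected within a planning horizon. The
# threshold, rates and horizon below are illustrative.

FAILURE_THRESHOLD = 100.0  # wear level at which the component likely breaks

def hours_until_failure(current_wear, wear_rate_per_hour):
    """Linear extrapolation of remaining useful life."""
    if wear_rate_per_hour <= 0:
        return float("inf")  # no measurable degradation
    return (FAILURE_THRESHOLD - current_wear) / wear_rate_per_hour

def needs_maintenance(current_wear, wear_rate_per_hour, horizon_hours=48.0):
    """True only if failure is projected within the planning horizon."""
    return hours_until_failure(current_wear, wear_rate_per_hour) <= horizon_hours
```

This is how the component "tells itself" it is worn: the twin turns raw sensor trends into a service decision instead of a fixed calendar interval.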
Users can gain accurate insights by combining real-time and predictive analyses, and can use Twin Builder to build, validate and distribute digital twins.
Capturing Value Throughout the Entire Life Cycle
Digital twin technology simulates behaviour under different environments and stresses so that the system can predict problems before they occur. The prediction combines information from physical sensors with physics-based models, and presents the results in 3D visualisation.
Bringing the digital and physical assets together enables value capture throughout the product lifecycle. “This solution helps equipment operators and service providers predict and improve asset performance and reliability with technical insights. A digital twin that merges technical models, manufacturing details and operational insights is unique in the industry.”
When we’re asked what the job entails, we say we’re problem hunters and solutions architects. When a company defines goals for its engineering team, like doubling productivity, eliminating errors or reducing repetitive tasks, we hunt for any problem that could prevent them from reaching the goal and then design a custom solution to solve it.
It is true that many problems are similar for most engineering teams, regardless of the type of product they design or industry they serve.
For example, many engineers mention large-assembly slowdowns affecting their team. While the symptoms are the same, the causes are, most of the time, unique to each team. Finding these specific causes and tailoring solutions for each customer is as much art as science.
Here’s the lowdown on how simulation is shaping design, and how it is not only changing the products that we design, but also changing the way we look at the design process as a whole.
What Is It?
First up, let’s get up to speed on the concept of simulation-driven design.
“Simulation-driven design is taking simulation technology and moving it from the middle and the late cycles of the design process to the very front of it. This drastically lowers the time it takes to develop products, because instead of going back and forth between detailed design and validation, we put validation or simulation at the front of that process.
We use simulation to design the product using things like surface interaction optimisation, or we integrate control systems at the earliest stages, and then when we get to validation, it’s a simple check box instead of an iterative process.
So, simulation-driven design is putting simulation at the front of the design process and using simulation technology to create a design instead of using simulation to figure it out later.”
This is very much a common thread in manufacturing and design. In the old days, we used to have to wait for a design to come downstream before testing, building or simulating it. Then, if the design wasn’t up to scratch, we would literally have to go back to the drawing board and try again.
Now, armed with design and simulation applications, we can effectively simulate early, and decide what strategy to use before getting too deep into the product lifecycle. We are now entering a world of pre-validated design.
How to Do It?
We’ve been hearing a lot about generative design and part interaction optimisation these last couple of years, with all of the big companies having their own take on this new way of doing things. Largely, it has been spurred on by the rise of additive manufacturing, which is permitting the creation of new geometries never before seen in manufacturing.
“This is a completely different concept that started as additive manufacturing technologies made it possible to print organic surfaces and structures that previously you couldn’t machine or cast: these organic, bone-looking designs, right? When that change happened, simulation-driven design really started to move from design validation, where you used simulation to validate a design, to simulation becoming the main driver informing the design.”
“We have several products geared toward this new trend. It’s our flagship product…it’s essentially becoming a full-blown environment for simulation-driven design. It has topology optimisation and generative design capabilities to create these designs. It has finite element assessments for components of those designs. It’s got motion tools in there to understand the mechanical performance, evaluate loads, and take those loads into optimisations—that’s something that no one else can do.
We can basically do a motion simulation, and bring those loads and motions into an optimisation. We’re optimising full assembly-level designs, which is something that no one else is really doing either. We can have multiple components all being optimised at one time.”
But is it really that easy to use? It’s all very nice having generative design, which presents you with a variety of design solutions at the click of a button. But while the product lifecycle may be moving into the future, modeling itself has remained a fairly entrenched task that seemed destined to stay in the past. Not so anymore.
“We created a whole suite of modeling tools that allow you to take those generated concepts and quickly produce a smooth, organic surface design. If you’re familiar with parametric modeling, you’ll know it’s very difficult to do organic surfaces in. We make it very easy. It’s called ‘3D tracing for engineers’: you’re basically tracing over the top of an optimised shape, and you end up with a final printable design or a design that’s ready for machining.”
Even if you’re unfamiliar with the Digital Twin, there’s a good chance that you use something of the sort anyway.
“A polygonal model is where you’d use blocks to model things…and what we did was make it so you can hook those blocks together. It’s almost like molding clay, but you’re still getting fully defined Parasolid geometry that can be used in any manufacturing process.”
Did you watch the video? Pretty cool, huh? The designer makes it look so easy. That means either he’s a simulation guru or it’s super easy to use. Or maybe both?
“This environment was built for design engineers—someone who is designing parts who needs to come up with lightweight and efficient structures. We’ve taken the complexity of simulation and made it very simple. You’ve heard simulation is tough, but we made it easy enough for everyone to use.
We’ve taken that to the next level. If you look at the ease of use of our interface, we’ve made it very easy to train designers how to use it—and once they start using it, they really become hooked on the style of modeling. And that has really been driving the interest in simulation-driven design.”
So now design engineering does look fun—almost like a video game.
Who Is Using It?
“The main driver that gets organisations interested to start with is lightweighting. You hear about lightweighting all the time; these were the types of companies initially interested in simulation-driven design. So, obviously the aerospace industry, anything going into space, any major unit that has a mass budget instead of a dollar budget: they’re using this to take as much weight out of the design as possible. The by-products are an accelerated development cycle and lower costs through material reduction.”
“So, ground vehicles, aerospace, general machinery devices are now starting to use it, especially as we look at structure optimisation, being able to encourage block growth through the structure—those things are very important, so we’re seeing more traction there as well.”
The Path to Mass Adoption
So, you’ve seen the features available in generative design-oriented packages. With something so revolutionary available to engineers, we should be seeing a lot more of these highly optimised designs trickling into our daily work, right?
Not quite. There is resistance to mass adoption whenever a new technology comes along, not just because of technological hurdles but also because of behavioural ones.
“On the behaviour side, regarding the more organic shapes…it’s half the weight yet twice as strong, which is pretty counterintuitive to engineering principles, right? There’s that old mantra of ‘when in doubt, build it stout’…simulation-driven design reverses that mantra and says that’s not the best way to do it. The best way is to put the material where it needs to go so it accommodates the forces that you’re designing for. So, it creates these very organic structures.
People think that more material makes it stronger, and that’s not the reality, so one of the biggest issues is getting past that.”
“Also, there is resistance to changing the way people design. Any time there’s a big change in the way that we design something, there’s always resistance to change, so that’s the biggest challenge to adoption: changing the way that people think about design, from the old concept of form follows function to the modern paradigm where form follows forces.”
“We’re still in the early stages—traditional design groups are starting to adopt this approach—we’re seeing this adoption more in the bleeding-edge companies, right? These kinds of companies and organisations are really pushing the edges of that, and now we’re starting to see that trickle down into traditional manufacturing industries as well.”
“As we look into the future, we are going to see topology optimisation techniques hooked up to iterate through numerous different concepts very quickly—so you can look at a lot more concepts and understand what’s going to work and what’s not going to work. Then you take the best concept and move forward with that design.”
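That concept-screening loop can be sketched in a few lines: score many candidate concepts with a fast surrogate metric and carry the best feasible one forward. The concepts and metrics below are invented for illustration; a real loop would call a topology optimiser rather than read a table.

```python
# Toy concept-screening sketch: score candidate concepts with a fast surrogate
# metric and keep the best feasible one. Concepts and metrics are made up.

candidates = [
    {"name": "lattice", "mass": 1.2, "stiffness": 9.0},
    {"name": "ribbed", "mass": 1.5, "stiffness": 10.0},
    {"name": "solid", "mass": 3.0, "stiffness": 12.0},
]

def score(concept, min_stiffness=8.5):
    """Lower mass wins, but only among concepts meeting the stiffness floor."""
    return concept["mass"] if concept["stiffness"] >= min_stiffness else float("inf")

best = min(candidates, key=score)  # the concept to carry into detailed design
```

The point is the shape of the loop, generate many, evaluate cheaply, keep one, rather than any particular scoring rule.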
Beyond the fun of solving difficult problems, the declared goal has always been to facilitate, through brainstorming, the discovery of new techniques and methods that deliver maximum benefit.
Here we present goals for a Working Group organisational structure. Forum participants find answers to established questions. Most of the problems are relevant to specific groups of users, who benefit from having limitations identified in the challenges and enhancement requests created alongside them.
1. Formalise the planning, development, integration, and use of models to inform enterprise and programme decision making and support engineering activities that digitally represent the system of interest
2. Ensure models are accurate, complete, and usable across disciplines to support communication, collaboration, performance, and decision making across lifecycle activities
3. Provide an enduring, authoritative source of truth, secured with authentication and access controls, to establish the technical baseline, produce digital artifacts, and support reviews for accurate decision making
4. Incorporate technological innovation to establish an end-to-end digital engineering enterprise and foster conditions for productive step advances towards goals
5. Enable end-to-end decision making using advanced human-machine interactions
6. Establish a mature supporting infrastructure to perform digital engineering activities across connected information networks
7. Develop, mature, and implement technology tools to realise digital engineering goals and share best practices using models to collaborate with stakeholders
8. Improve digital engineering knowledge base, policy, guidance, specifications, and standards and streamline contracting, procurement and business operations
9. Lead and support digital engineering transformation efforts, vision, strategy, and implementation to establish accountability to measure and demonstrate results across programmes
10. Build and prepare workforce to develop knowledge, competence, and skills with active participation and engagement in planning and implementing transformation efforts