
Q&A Session Questions

1. What would be the first action organisations should take when considering a strategy for Engineering Simulation?

  • The first step is to identify and gather your stakeholders and agree your goals for Simulation.
  • The next step is to identify your detailed requirements, i.e. the ‘to-do’ list you need Simulation to be able to address.

2. What is the most important action organisations should take to improve their Engineering Simulation Capability?

  • Having identified your goals and requirements, the most important action is to assess your current position regarding your processes, methods, tools, models, etc.
  • Where are the gaps?

3. What are some of the common issues that organisations face with their Engineering Simulation?

  • Inefficient, slow processes.
  • Lack of consistent measurement of confidence across the organisation.
  • Poor data management causing inefficiency and concern over quality.

4. One of the challenges we have is aligning expectations of simulation credibility and the definition of accuracy. Could you say a few words about this based on your experience?

  • Yes, this is really important. Of course, it goes back to your product and requirements. How accurate does your modelling result need to be? What correlation data do you have to verify your models and methods? Do you have a Verification and Validation process in place?
  • There are several established industry metrics that can be used to quantify confidence (see the illustrative sketch below). Publications, references and courses on this topic are available via the NAFEMS website.
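
As a purely illustrative sketch, and not one of the formal industry metrics referenced above, comparing model predictions against correlated test data and agreeing an acceptance threshold might look like this (the values, metric choice and tolerance are assumptions for illustration only):

```python
import numpy as np

# Hypothetical illustration: compare simulation predictions against test
# measurements at matching load cases and report simple accuracy metrics.
# The numbers and the 5% tolerance below are placeholders, not real data.

simulation = np.array([102.5, 148.0, 197.3, 251.9])  # predicted response
test = np.array([100.0, 150.0, 200.0, 245.0])        # measured response

relative_error = (simulation - test) / test           # signed error per case
nrmse = np.sqrt(np.mean((simulation - test) ** 2)) / np.ptp(test)

print(f"Max relative error: {np.max(np.abs(relative_error)):.1%}")
print(f"Normalised RMS error: {nrmse:.1%}")

# A team might agree a target such as "max relative error within 5%"
# as part of its Verification and Validation acceptance criteria.
print("Within agreed tolerance:", np.max(np.abs(relative_error)) <= 0.05)
```

The point is not the particular metric, but that the level of accuracy expected, and the data used to demonstrate it, are agreed and recorded rather than assumed.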

Is there a particular sector that's taken a lead in implementation? Aerospace/Automotive/Chemical/Nuclear etc. If so, what do you think is the differentiator?

  • The credibility metrics have been driven most by those industry sectors where it is most difficult, or impossible, to conduct representative physical testing; hence particularly Aerospace (especially Space) and Nuclear. In fact, there is a great NAFEMS World Congress paper, presented by J-F Imbert and W Oberkampf, that reviews a range of the tools available for this purpose. It can be found at: https://www.nafems.org/publications/resource_center/nwc17_683/

5. I'm a structural analyst at a crane and lifting vehicles manufacturer. It's a medium-scale business, and in the technical office (engineering dept) we are a total of 10 engineers. Since we are small, I'm the only dedicated analyst, alongside three other mechanical engineers covering other roles. We do have specialist software and a simulation strategy / verification approach / validation techniques, mostly developed and put into procedure by one person (me) and reviewed together.

We depend highly on the simulation results for our product development and release. This puts high-risk responsibility on one person.

Moving on from this, is it better to propose recruiting another junior engineer, or better to build up my own simulation strategy knowledge?

  • I would reassure you that although your team is small, the principles of what is necessary for an Engineering Simulation Capability still apply. I would recommend reviewing your Goals and Requirements (what do you need to be able to do?), and then encourage you (since you will know best) to think about where the gaps are. Is it capability for a particular aspect (e.g. lack of a tool, or lack of data to correlate your methods), or is it capacity (e.g. do you need more people to do simulation, or support to make your process more efficient)?

Other tips regarding this:

  • It might be helpful to explain your thinking and try to get support from a leader in the organisation who understands the need and can help you with what needs to be done.

6. Could you elaborate on collaboration between teams building simulation models of varying fidelity/complexity? CFD/FEA for design and real-time modelling for Hardware-in-the-Loop.

  • Of course it depends on the detail, but this may be entirely appropriate.
  • Are the teams addressing different (physics) requirements? Is the CFD model also required for other purposes?
  • A high-fidelity CFD/FEA model may be necessary to represent the detail and achieve the accuracy needed to address a specific performance requirement. But such a detailed model is likely to be too complex and slow to run in real time, and hence no good as a ‘Hardware in the Loop’ model. A surrogate model may therefore be needed that represents the performance of the system in a simpler form that is fast enough to run in real time (see the sketch after this list).
  • Is the CFD model in fact used to generate the data necessary to create the surrogate model?
  • The models could even be representing the same physics but at different levels of fidelity and speed.
  • In any case, there should be a strong connection between the two models (and hence teams), as the output from either might influence the design, and both need to keep up with changes.
  • It may also be that some process automation linking the two processes could help efficiency.
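
As a minimal sketch of the surrogate idea, assuming made-up variable names and data (not from any specific toolchain), a small set of expensive high-fidelity results can be used to train a cheap approximation that is fast enough for a real-time Hardware-in-the-Loop step:

```python
import numpy as np

# Hypothetical illustration: a handful of expensive high-fidelity runs
# (the inputs and outputs below are placeholder values) are used to
# train a cheap surrogate that can be evaluated at real-time rates.

# Design variable (e.g. flow rate) and corresponding high-fidelity result
# (e.g. pressure drop) from the detailed CFD/FEA model.
x_hifi = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y_hifi = np.array([0.8, 1.9, 3.6, 6.1, 9.0, 12.8])

# Fit a low-order polynomial surrogate to the sampled data.
surrogate = np.poly1d(np.polyfit(x_hifi, y_hifi, deg=2))

# The surrogate is trivially cheap to evaluate, so it could sit inside a
# real-time Hardware-in-the-Loop step where the full model could not.
def hil_step(flow_rate):
    return surrogate(flow_rate)

print(hil_step(1.75))  # fast approximate prediction between sample points
```

The same pattern applies whether the surrogate is a simple curve fit, a response surface or a reduced-order model; the key point is that the high-fidelity model supplies the data, and the two must be kept in step as the design changes.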

7. When is the next offering of the M&S strategy course?

Does this course include practical case studies?

  • There is a lot of content to cover, so the course doesn’t currently go into specific case studies in detail, but reference is made to practical application as much as possible throughout the course.

Do you have a free plan for students interested in these kinds of courses?

  • You would need NAFEMS to address this directly.

8. Based on the trajectory of developments in the simulation industry, do you foresee the ability to eliminate testing of physical prototypes in the near future? How soon might that be, and do you know of any work on this?

  • This is very sector-, product- and organisation-specific. Many industries have already been able to reduce their dependency on test significantly, and indeed some have been able to eliminate certain testing for specific purposes. It depends on the accuracy of the physics tools, the level of correlation that has been possible, and the reliability of the organisation’s processes, methods, model and data management, and the experience and skills of its people.
  • If organisations have mature modelling capability and existing data that demonstrates that their models have consistently and accurately predicted the performance, or failure, of their products, then they may have achieved the necessary confidence to reduce their reliance on physical testing.
  • This doesn’t happen by chance though. Specific effort needs to be put in place to ensure reliability in processes, accuracy in methods, representative models, mature data, skilled teams, and capable computing resources.

9. Is there a way to force our VPs and Directors to watch this webinar at 3X speed?

  • I’m not sure I can help with that exactly, but I would suggest it is worth highlighting the value and contribution that Simulation is already making, or could make, to your organisation, and the extra value that could be achieved by giving it the appropriate focus and support.
  • Perhaps highlight some key issues that might be holding you back.
  • The other suggestion is to select a handful of the messages from the webinar that you think are most relevant to your organisation and share those.