LS-OPT is a standalone Design Optimization and Probabilistic Analysis package with an interface to LS-DYNA.

In the “conventional design” approach, a design is improved by evaluating its “response” and making design changes based on experience or intuition. This approach does not always lead to the desired result, that of a ‘best’ design, since the design objectives are often in conflict. It is therefore not always clear how to change the design to achieve the best compromise of these objectives. A systematic approach can be obtained by using an inverse process of first specifying the criteria and then computing the ‘best’ design according to a formulation. The improvement procedure that incorporates design criteria into a mathematical framework is referred to as Design Optimization. This procedure is often iterative in nature and therefore requires multiple simulations.

No two products of the same design will be identical in performance, nor will a product perform exactly as designed or analyzed. A design is typically subjected to input variations (Structural variation and Environmental variation) that cause a variation in its response and may lead to undesirable behavior or failure. In this case a Probabilistic Analysis, using multiple simulations, is required to assess the effect of the input variation on the response variation and to determine the probability of failure.

To run and control multiple analyses simultaneously, LS-OPT provides a simulation environment that allows distribution of simulation jobs across multiple processors or networked computers. Each job running in parallel consists of the simulation, data extraction and disk cleanup. Measurements of time remaining or performance criteria such as velocity or energy are used to measure job progress for LS-DYNA’s explicit dynamic analysis calculations.

The graphical preprocessor LS-OPTui facilitates definition of the design input and the creation of a command file, while the postprocessor provides output such as approximation accuracy, optimization convergence, tradeoff curves, anthill plots and the relative importance of design variables. The postprocessor also links to LS-PrePost to allow the viewing of the model representing a chosen simulation point.

Typical applications of LS-OPT include:

Future versions of LS-OPT will combine optimization and probabilistic analysis features in Reliability-Based Design Optimization.



The Optimization capability in LS-OPT is based on Response Surface Methodology and Design of Experiments. The D-Optimality Criterion is used to distribute sampling points effectively for generalization of the design response. A Successive Response Surface Method allows convergence of the design response. Neural Networks provide an updateable global approximation that is gradually built up and refined locally during the iterative process. A Space Filling sampling scheme is used to update the sampling set by maximizing the minimum distance among new and existing sampling points.
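The space-filling idea mentioned above can be illustrated with a small sketch. This is an assumed, greedy "maximin" heuristic for illustration only, not LS-OPT's actual algorithm: each new point is chosen to maximize the minimum distance to all points selected so far.

```python
# Greedy maximin space-filling update (illustrative sketch, assumed):
# add new sampling points in the unit hypercube so that the minimum
# distance among new and existing points is as large as possible.
import random
import math

def maximin_update(existing, n_new, n_candidates=200, seed=0):
    """Greedily add n_new points in [0,1]^d, each maximizing the
    minimum distance to all points chosen so far."""
    rng = random.Random(seed)
    d = len(existing[0])
    points = list(existing)
    for _ in range(n_new):
        best, best_dist = None, -1.0
        for _ in range(n_candidates):
            cand = [rng.random() for _ in range(d)]
            dist = min(math.dist(cand, p) for p in points)
            if dist > best_dist:
                best, best_dist = cand, dist
        points.append(best)
    return points[len(existing):]

existing = [[0.0, 0.0], [1.0, 1.0]]   # sampling points already simulated
new_pts = maximin_update(existing, n_new=3)
```

With the two existing points in opposite corners, the first new points land near the empty corners, filling the design space evenly.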

LS-OPT allows the combination of multiple disciplines and/or cases for the improvement of a unique design. Multiple criteria can be specified, and analysis results can be combined arbitrarily using C- or FORTRAN-style mathematical expressions.

Response Surface Methodology

Response surface methodology (RSM) is a collection of statistical and mathematical techniques useful for developing, improving and optimizing the design process. RSM encompasses a point selection method (also referred to as Design of Experiments), Approximation methods and Design Optimization to determine optimal settings of the design dimensions. RSM has important applications in the design, development, and formulation of new products, as well as in the improvement of existing product designs.
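As a minimal illustration of the approximation step (assumed for this document, not LS-OPT code), a quadratic response surface can be fitted to a handful of sampled responses by ordinary least squares and then queried in place of the expensive simulation:

```python
# Fit a quadratic response surface y ~ b0 + b1*x + b2*x^2 to sampled
# responses by ordinary least squares (illustrative sketch).
import numpy as np

def fit_quadratic(x, y):
    # Design matrix with columns [1, x, x^2]
    X = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict(coef, x):
    # Evaluate the surrogate instead of running a new simulation
    return coef[0] + coef[1] * x + coef[2] * x**2

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])      # sampling points
y = 3.0 - 2.0 * x + 0.5 * x**2               # "simulated" responses
coef = fit_quadratic(x, y)
```

Once fitted, `predict` generalizes the response across the design space at negligible cost, which is what makes the iterative optimization affordable.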

In LS-OPT, Response Surface Methodology is used both in Optimization and Probabilistic Analysis as a means to reduce the number of simulations. In the latter procedure, RSM is also used to distinguish deterministic effects from random effects.

Probabilistic Analysis

LS-OPT enables the investigation of stochastic effects using Monte Carlo simulation involving either direct FE analysis or analysis of surrogate models such as response surfaces or neural networks. As an input distribution, any of a series of statistical distributions such as Normal, Uniform, Beta, Weibull or User-defined can be specified. Latin Hypercube sampling provides an efficient way of implementing the input distribution. Histograms and influence plots are available through the postprocessor (Version 2.2).
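The combination of Latin Hypercube sampling with a cheap surrogate can be sketched as follows. This is an assumed toy setup (the surrogate `response` and the failure limit are invented for illustration): the normal input is stratified into equal-probability bins with one draw per bin, and the samples are pushed through the surrogate to estimate response statistics and a failure probability.

```python
# Latin Hypercube sampling of a normal input variable, followed by a
# Monte Carlo evaluation on a cheap surrogate model (illustrative sketch).
import random
import statistics
from statistics import NormalDist

def latin_hypercube_normal(n, mu, sigma, seed=0):
    """One stratified draw per equal-probability stratum of N(mu, sigma)."""
    rng = random.Random(seed)
    nd = NormalDist(mu, sigma)
    u = [(i + rng.random()) / n for i in range(n)]  # one draw per stratum
    rng.shuffle(u)
    return [nd.inv_cdf(p) for p in u]

def response(t):
    # Hypothetical surrogate: stress as a function of thickness t
    return 100.0 / t

samples = latin_hypercube_normal(1000, mu=2.0, sigma=0.1)
stresses = [response(t) for t in samples]
mean = statistics.mean(stresses)
p_fail = sum(s > 55.0 for s in stresses) / len(stresses)  # P(stress > limit)
```

Because every stratum of the input distribution is hit exactly once, the Latin Hypercube estimate of the mean and failure probability converges with far fewer samples than plain random sampling.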

Instability/Noise/Outlier Investigations (Version 2.2)

Some structural problems may not be well-behaved, i.e., a small change in an input parameter may cause a large change in the results.

LS-OPT computes various statistics of the displacement and history data for viewing in the LS-DYNA FE model postprocessor (LS-PrePost). The methodology differentiates between changes in results due to design variable changes and those due to structural instabilities (buckling) and numerical instabilities (lack of convergence or round-off). Viewing these results in LS-PrePost allows the engineer to pinpoint the source of instability for any chosen response and therefore to address instabilities which adversely affect the predictability of the results.


Tradeoff Studies

A tradeoff study enables the designer to interactively study the effect of changes in the design constraints on the optimum design. For example, the safety factor for the maximum stress in a beam is changed, and the designer wants to know how this change affects the optimal thickness and displacement of the beam.

Variable Screening

For each response, the relative importance of all variables can be viewed on a bar chart together with their confidence intervals. This feature enables the user to identify variables of lesser importance that can be removed from the optimization, thereby saving time while having little effect on the final result.
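One simple way to obtain such importance measures (a hedged sketch, not necessarily LS-OPT's exact method) is to fit a linear surrogate and rank the magnitudes of the coefficients; variables with small coefficients are candidates for removal:

```python
# Variable screening by ranking the magnitude of linear-surrogate
# coefficients (illustrative sketch with synthetic data).
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.uniform(-1, 1, n)            # important variable
x2 = rng.uniform(-1, 1, n)            # unimportant variable
y = 5.0 * x1 + 0.01 * x2 + rng.normal(0, 0.1, n)   # "simulated" response

X = np.column_stack([x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
importance = np.abs(coef)             # relative importance per variable
ranking = np.argsort(importance)[::-1]
```

Here `x2` receives a near-zero coefficient, so it could be dropped from the optimization with little effect on the result.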


Glossary

Design of Experiments: A point selection method for determining the number and locations of sampling points in the Design Space. A simulation is done at each sampling point.

Approximation: A simple mathematical function acting as a substitute (or surrogate model) to generalize the (often highly complex) Response variation across the Design Space.

Response: The result obtained from an analysis (e.g. Finite Element Analysis) of a product or process. The response is used as a criterion in Design Optimization or Probabilistic Analysis.

Design Optimization: The process of setting the design variables, typically the dimensions, of a product to minimize or maximize the value of its Response. A more general form of optimization includes specified limits on other responses (constrained optimization).

Probabilistic Analysis: The analysis of a set of different designs with a specified distribution in order to determine the characteristics (such as the mean and standard deviation) of the Response distribution.

Design Space: The region between the lower and upper limit for each of the design variables. These limits are specified to prevent the occurrence of designs with extreme or nonsensical dimensions (such as negative thicknesses).

Region of interest: A part of the Design Space considered to be of interest for design exploration or Design Optimization.

Design Variable: An independent variable or dimension which forms part of the description of a design. Typical design variables are thickness dimensions, geometrical dimensions or values of material constants.

D-Optimality Criterion: A criterion that determines how well the coefficients of the design Approximation are estimated. Changing the locations of the sampling points to maximize this criterion maximizes the confidence in the coefficients of the Approximation model.
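The criterion can be made concrete with a small sketch (an assumed illustration, not LS-OPT's implementation): for a linear one-variable model, the D-criterion is the determinant of the information matrix X^T X of the design matrix, and spreading the sampling points over the range increases it.

```python
# D-optimality for a one-variable linear approximation b0 + b1*x:
# score a candidate point set by det(X^T X); larger determinants mean
# better-estimated coefficients (illustrative sketch).
import numpy as np

def d_criterion(points):
    x = np.asarray(points, dtype=float)
    X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
    return np.linalg.det(X.T @ X)

clustered = [0.45, 0.5, 0.55]   # points bunched together
spread = [0.0, 0.5, 1.0]        # points spread over the range
```

The spread design has a far larger determinant, which is why D-optimal point selection pushes sampling points apart within the region of interest.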

Robust: A robust product performs consistently on target and is relatively insensitive to parameters that are difficult to control. A robust design minimizes the noise transmitted by the noise variables.

Noise variable: A parameter of a product that has some degree of uncontrollability while the product is being manufactured or used in the field, up to the end of its lifetime.

Response Noise: The random component of a response variation that can be caused by instability of the structure (such as buckling), numerical round-off during analysis, or modeling effects such as Finite Element meshing or lack of convergence during analysis.

Successive Response Surface Method: An iterative method with a scheme to ensure convergence of the optimization process. The scheme determines the location and size of each successive Region of interest in the Design Space, builds a response surface in this region, conducts a Design Optimization, and checks the tolerances on the Responses and design variables for termination. When using neural networks instead of polynomials as a surrogate model, the Approximation is updated rather than newly constructed in each iteration. Consequently, the final approximation has a global representation that can be used for optimization, tradeoff studies or probabilistic analysis.
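The iteration loop described above can be sketched in one dimension. This is a deliberately simplified, assumed version (fixed shrink factor, quadratic surrogate, no adaptive panning logic), not the scheme LS-OPT actually implements:

```python
# Highly simplified successive-response-surface loop in one variable:
# sample the region of interest, fit a quadratic surrogate, move the
# region center to the surrogate optimum, shrink, repeat.
import numpy as np

def srsm_1d(f, center, half_range, iters=8, n_samples=5, shrink=0.7):
    for _ in range(iters):
        x = np.linspace(center - half_range, center + half_range, n_samples)
        y = np.array([f(xi) for xi in x])
        X = np.column_stack([np.ones_like(x), x, x**2])
        b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
        if b2 > 0:                       # surrogate has an interior minimum
            x_opt = -b1 / (2 * b2)
        else:                            # fall back to best sampled point
            x_opt = x[np.argmin(y)]
        # keep the new center inside the current region, then shrink it
        center = min(max(x_opt, center - half_range), center + half_range)
        half_range *= shrink
    return center

# Minimize a simple test response whose optimum is at x = 3
x_star = srsm_1d(lambda x: (x - 3.0) ** 2 + 1.0, center=0.0, half_range=2.0)
```

Even though the starting region does not contain the optimum, the region center pans toward it and the shrinking region then converges on x = 3.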

Structural variation: Variation in the dimensions or material properties of a product.

Environmental variation: Variation in the loads, such as force (perhaps due to impact) and temperature, considered in the design of a product.

System Identification: The determination of system parameters such as material constants to minimize the difference between computational responses and experimental results. The purpose is to identify the system parameters of a model by using the results of a physical experiment.
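A minimal sketch of this definition (an assumed textbook example, not taken from LS-OPT): identify Young's modulus E from measured bar elongations, where the model d = F·L/(E·A) is linear in 1/E, so the least-squares optimum has a closed form.

```python
# System identification by least squares: find the material constant E
# minimizing the squared difference between computed and "experimental"
# bar elongations d_i = F_i * L / (E * A)  (illustrative sketch).

def identify_modulus(forces, elongations, L=1.0, A=1e-4):
    # Model: d_i = c_i / E with c_i = F_i * L / A; minimizing
    # sum (c_i/E - d_i)^2 over 1/E gives E = sum(c_i^2) / sum(c_i * d_i)
    c = [F * L / A for F in forces]
    return sum(ci * ci for ci in c) / sum(ci * di for ci, di in zip(c, elongations))

forces = [1000.0, 2000.0, 3000.0]                           # applied loads, N
true_E = 2.1e11                                             # Pa (steel-like)
elongations = [F * 1.0 / (true_E * 1e-4) for F in forces]   # synthetic "experiment"
E_est = identify_modulus(forces, elongations)
```

With noise-free synthetic data the identified modulus matches the true value; with real experimental scatter the same least-squares formulation returns the best-fitting constant.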