Align Computer Simulation Strategies with Problem and Model
Mathematical models can be confirmed through comparison to real-world data
|Figure 1 – JMP Prediction Profiler and Simulator|
Computer simulations begin with the statement of a problem, typically involving the description of the state of a system. A mathematical model is needed to describe the system, which may be based on theory (deductive) or observations (inductive). This can be constructed using modeling software (e.g. Fit Model platform in JMP software) and saved as a prediction formula.
A simulation strategy should be selected that aligns with the problem and model. The two most common approaches are deterministic and stochastic. In a deterministic simulation, the model parameters uniquely determine the system states, so repeated runs from the same initial settings behave identically. In a stochastic simulation, an element of randomness is added using a probability distribution, such as the Normal, LogNormal, Exponential, Weibull, Binomial or Poisson.
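The distinction can be sketched in a few lines of Python. The prediction formula and its coefficients below are hypothetical, chosen only to illustrate the two approaches:

```python
import random

# Hypothetical prediction formula: a simple linear model of one input.
# The coefficients are illustrative, not from any fitted model.
def predict(x):
    return 2.0 * x + 1.0

# Deterministic simulation: the same inputs always give the same output.
deterministic = [predict(x) for x in (0.0, 0.5, 1.0)]

# Stochastic simulation: add normally distributed random error to each run,
# so repeated runs from the same initial settings differ.
rng = random.Random(42)  # fixed seed so the random draws are reproducible
stochastic = [predict(0.5) + rng.gauss(0.0, 0.1) for _ in range(5)]

print(deterministic)
print(stochastic)
```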
|Figure 2 – Capability analysis of simulation|
Monte Carlo methods are a class of algorithms that use repeated random sampling and are often used to supplement theory. They were developed by John von Neumann, Stanislaw Ulam and Nicholas Metropolis during the Manhattan Project. The name derives from the fact that Stanislaw Ulam's uncle spent much of his time and money at the famous Monte Carlo casino, which one would hope would have a large degree of randomness associated with its games of chance.

Graphical output, such as the Prediction Profiler and Simulator options available in JMP software, can be used to visualize the distributions of the input variables against a pre-selected set of specification limits shown as lines. For the three independent variables (X1, X2, X3), a normal distribution with a defined mean and standard deviation (SD) plus random error is demonstrated in Figure 1.
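A minimal Monte Carlo sketch of this setup, assuming a hypothetical response formula and illustrative means, SDs and specification limits (none of these values come from the figures):

```python
import random

rng = random.Random(123)

# Hypothetical response: Y as a function of three inputs (coefficients
# are illustrative only).
def response(x1, x2, x3):
    return x1 + 0.5 * x2 - 0.25 * x3

N = 10_000
LSL, USL = 0.5, 1.5  # assumed specification limits on Y

out_of_spec = 0
for _ in range(N):
    # Each input is drawn from a normal distribution with an assumed
    # mean and SD, mirroring the centered-with-random-error setup.
    x1 = rng.gauss(1.0, 0.05)
    x2 = rng.gauss(0.4, 0.05)
    x3 = rng.gauss(0.8, 0.05)
    y = response(x1, x2, x3)
    if not (LSL <= y <= USL):
        out_of_spec += 1

print(f"Estimated defect rate: {out_of_spec / N:.4f}")
```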
For industrial applications, there can be a further refinement using weighted, truncated or censored options. Weighted distributions are useful for estimating rare events. Truncated distributions account for product that exceeds specifications and is discarded. Censored distributions clamp values that would have exceeded a limit to that limit, representing product that can be re-worked to meet specification. The generation of probability distributions relies on random or pseudo-random numbers. Pseudo-random numbers must have certain characteristics to be usable in simulations, such as long periods before a sequence repeats and an accurate fit to the target distribution. Because they converge more quickly and provide more even coverage of a distribution, low-discrepancy sequences often can be used in place of random (or pseudo-random) sampling. The software selects a random seed value as the starting point for the simulation; if the seed can instead be specified, the simulation results become reproducible. A specified number of runs can be generated and a capability analysis performed (Figure 2).
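The truncated and censored options, the role of a specified seed, and a simple capability calculation can be sketched as follows; the limits, mean, SD and Cpk formula parameters are assumed for illustration:

```python
import random
import statistics

def truncated_normal(rng, mu, sd, low, high):
    """Rejection-sample a normal truncated to [low, high]
    (out-of-limit product is discarded and redrawn)."""
    while True:
        x = rng.gauss(mu, sd)
        if low <= x <= high:
            return x

def censored_normal(rng, mu, sd, low, high):
    """Censor (clamp) out-of-limit draws to the nearest limit
    (product re-worked to be at the limit)."""
    return min(max(rng.gauss(mu, sd), low), high)

# A specified seed makes the simulation reproducible run-to-run.
rng = random.Random(2024)
LSL, USL = 9.0, 11.0  # assumed specification limits

runs = [truncated_normal(rng, 10.0, 0.4, LSL, USL) for _ in range(5000)]

# Simple capability analysis (Cpk) on the simulated runs.
mean = statistics.fmean(runs)
sd = statistics.stdev(runs)
cpk = min(USL - mean, mean - LSL) / (3 * sd)
print(f"mean={mean:.3f}, sd={sd:.3f}, Cpk={cpk:.2f}")
```

Re-running with the same seed reproduces the same runs exactly, which is the practical benefit of specifying the seed rather than letting the software choose it.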
|Figure 3 – Defect Profiler|
The distribution of the Y1 dependent variable shows that the process needs to be centered; once centered, its capability would be acceptable (> 1.0), with the proportion of out-of-specification occurrences falling below one percent. The distribution of the Y2 dependent variable, by contrast, is fairly well-centered but shows too much variability, which must be reduced for the process to perform acceptably. The defect rate as a function of each independent variable, for each dependent variable, can be evaluated using a tool such as the Defect Profiler. This identifies the variable to which the process is most sensitive, so that improvement opportunities can be pursued (Figure 3).
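The idea behind such a sensitivity analysis can be sketched as a sweep: shift the mean of one input at a time, hold the others at nominal, and estimate the defect rate at each setting. The response formula, nominal values and specification limits below are hypothetical:

```python
import random

rng = random.Random(7)

# Hypothetical prediction formula (illustrative coefficients).
def response(x1, x2, x3):
    return x1 + 0.5 * x2 - 0.25 * x3

LSL, USL = 0.7, 1.3  # assumed specification limits on Y
N = 5000

def defect_rate(mu1, mu2, mu3, sd=0.1):
    """Monte Carlo estimate of the out-of-spec proportion."""
    bad = 0
    for _ in range(N):
        y = response(rng.gauss(mu1, sd), rng.gauss(mu2, sd), rng.gauss(mu3, sd))
        if not (LSL <= y <= USL):
            bad += 1
    return bad / N

# Sweep each input's mean in turn, holding the others at nominal, to see
# which variable the defect rate is most sensitive to.
nominal = (1.0, 0.4, 0.8)
sweeps = {}
for i in range(3):
    rates = []
    for shift in (-0.1, 0.0, 0.1):
        mus = list(nominal)
        mus[i] += shift
        rates.append(defect_rate(*mus))
    sweeps[f"X{i+1}"] = rates
    print(f"X{i+1}: defect rates across sweep = {rates}")
```

The input whose sweep produces the largest swing in defect rate is the one the process is most sensitive to, and hence the best candidate for improvement effort.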
|Figure 4 – Gaussian process model of log10 defect rate|
A simulation experiment over a portion of the factor space can be run on some or all of the factors, and a Gaussian process model of the overall defect rate calculated. The overall defect rate can then be minimized as a function of each variable, so that the optimal value of each variable can be calculated and used as the basis of future experimentation (Figure 4).

Verification of these simulations needs to be performed either through an experimental design that covers the design space, or at least at critical points. This will strengthen the validity of the model or, if the model is not confirmed, lead to the creation of a newer, more representative model.
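A toy version of this step, assuming illustrative simulation-experiment results: fit a Gaussian process (squared-exponential kernel) to log10 defect rates observed at a few settings of one factor, then take the minimizer of the posterior mean as a candidate operating point for confirmatory experiments:

```python
import math

# Hypothetical simulation-experiment results: log10 defect rate observed
# at a few settings of one factor (values are illustrative, not real data).
X = [0.0, 0.25, 0.5, 0.75, 1.0]
y = [-1.0, -1.8, -2.5, -1.9, -1.1]

def rbf(a, b, length=0.3):
    """Squared-exponential (RBF) covariance between two settings."""
    return math.exp(-0.5 * (a - b) ** 2 / length ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Fit the GP: solve K alpha = y, with a small noise term on the diagonal.
K = [[rbf(a, b) + (1e-6 if i == j else 0.0)
      for j, b in enumerate(X)] for i, a in enumerate(X)]
alpha = solve(K, y)

# Posterior mean over a fine grid of settings; its minimizer is a
# candidate optimum to carry into real (confirmatory) experiments.
grid = [i / 200 for i in range(201)]
mean = [sum(rbf(g, xi) * ai for xi, ai in zip(X, alpha)) for g in grid]
best = grid[min(range(len(grid)), key=lambda i: mean[i])]
print(f"predicted optimum near x = {best:.3f}")
```

In practice this would be done per factor over the chosen portion of the factor space, with the verification runs placed at or near the predicted optimum.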
Computer simulation of a mathematical model, whether built on a theoretical or an empirical basis, can be used to evaluate and confirm an existing model through comparison to real-world data. It also can provide information toward the creation of a newer, more robust model.

Mark Anawis is a Principal Scientist and ASQ Six Sigma Black Belt at Abbott. He may be reached at editor@ScientificComputing.com.