Using Revamped @RISK for Six Sigma Data and Risk Analysis


(Editor's note: Six Sigma & Process Excellence IQ searched its contacts on LinkedIn for someone who would be able to offer insight into @RISK 5.5.1 and selected Kimball Fink-Jensen to review the software. Fink-Jensen is in no way associated with Palisade Corporation.)

It is common for Lean and Six Sigma practitioners to place a great deal of emphasis on the tools and techniques of Lean and on their DMAIC analysis. Somewhat ironically, it is less common for them to drill properly into the question of whether their models and analysis actually represent the underlying variation in the data under observation. Too often, models are created as static representations of what are often large data streams, rather than reflections of the dynamic variation within the process.

Fortunately, there is an excellent tool to help drill further into the true meaning of the data, and the recent minor upgrade provides an excuse to discuss it: @RISK from Palisade Corporation.

@RISK Background

@RISK is a risk analysis tool that is particularly easy to use. It was first developed as a standalone package in 1984, but since 1987 it has existed as a highly integrated add-in to spreadsheets (first Lotus 1-2-3, then Microsoft Excel), bringing the power of Monte Carlo analysis within reach of anyone who could use a spreadsheet. The current product bears little resemblance to the one I first encountered in 1995: extremely fast calculation times, parallel use of multiple processors, dynamic output charts, regression scatter plots, shared function libraries, an extended product range and so on.

The Six Sigma capability was first introduced into @RISK with version 5.0 in 2008. @RISK and Six Sigma are excellent bedfellows: @RISK provides the means to dynamically analyze both the starting data and the ongoing improvements being made to the process, giving even more insight into the true causes of variability.

The 2009 upgrade, @RISK 5.5, provided a host of new capabilities: new functions, graphs, interface improvements, simulations based on data from previous simulations, and so on. The most significant improvements, however, were in ease of use and simulation speed. The number of steps in several of the user interfaces was reduced, and simulations now complete anywhere from 100 percent to 2,000 percent faster. This gain came primarily from a complete re-write of the computational engine within @RISK, although the advent of multiple processors and the ability of Excel 2007 onwards to use those processors simultaneously have also contributed to further speed increases in recent times.

@RISK 5.5.1

The recent release of @RISK 5.5.1 provides a minor update to @RISK 5.5. The additions include in-line handling of password-protected sheets, a merged ribbon when RISKOptimizer is also in use (it comes as part of the Industrial version of the DecisionTools Suite that includes @RISK) and some additions to the VBA Controls.

In addition, if you use @RISK within the DecisionTools Suite, you will now find multiple language options and the ability to open any of the other Suite applications from within @RISK, and it is now easier to set up @RISK and the other Suite applications to open automatically each time Excel is loaded.

Using @RISK for a Six Sigma Analysis

Creating a Six Sigma analysis is very simple in @RISK.



First, the Input variables are defined. Typically these are design or process inputs. For process inputs, it is likely you will have tables of actual data to go along with the specifications and tolerances of the design inputs. The actual data is assessed using the Fit Distribution tool in @RISK to determine which probability distribution best explains the mean and variation observed in the data. Each of these fitted distributions then becomes an Input to the @RISK model.
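The following is not the @RISK Fit Distribution tool itself, just a minimal Python sketch of the same idea: fit several candidate distributions to a column of observed process data and rank them by a goodness-of-fit criterion (here AIC). The sample data and the list of candidate distributions are illustrative placeholders.

```python
import numpy as np
from scipy import stats

# Placeholder process measurements; in practice this would be the observed data table.
rng = np.random.default_rng(42)
data = rng.normal(loc=10.2, scale=0.35, size=200)

# Candidate distributions to try, mirroring what a distribution-fitting tool cycles through.
candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

results = []
for name, dist in candidates.items():
    params = dist.fit(data)                      # maximum-likelihood parameter estimates
    loglik = np.sum(dist.logpdf(data, *params))  # log-likelihood of the fitted distribution
    aic = 2 * len(params) - 2 * loglik           # Akaike information criterion
    results.append((aic, name, params))

# Lower AIC indicates a better trade-off between fit quality and model complexity.
for aic, name, params in sorted(results):
    print(f"{name:10s}  AIC = {aic:8.1f}  params = {np.round(params, 3)}")
```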





The process limits, target values and other Six Sigma properties are also added for each measured process.



A RiskSixSigma output function is then added to the normal @RISK RiskOutput function to tell the simulator to calculate regular Six Sigma statistics such as process capability, defective parts per million and the process sigma level, as well as yield values, z-scores, process center measures and so on.
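As a rough illustration of what such statistics involve (not @RISK's internal calculations), the Python sketch below computes Cp, Cpk, Cpm, defective parts per million and a sigma level from a set of simulated output values, given specification limits and a target as defined in the previous step. The spec limits, target, sample values and the conventional 1.5-sigma long-term shift are all illustrative assumptions, and sigma-level conventions vary in practice.

```python
import numpy as np

def six_sigma_stats(samples, lsl, usl, target, shift=1.5):
    """Capability-style statistics for one simulated output (illustrative conventions)."""
    mean = samples.mean()
    sd = samples.std(ddof=1)
    cp = (usl - lsl) / (6 * sd)                                    # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)                   # capability allowing for centring
    cpm = (usl - lsl) / (6 * np.sqrt(sd**2 + (mean - target)**2))  # capability around the target
    out_of_spec = np.mean((samples < lsl) | (samples > usl))
    dpm = out_of_spec * 1_000_000                                  # defective parts per million
    z_min = min(usl - mean, mean - lsl) / sd                       # z-score of the nearer spec limit
    sigma_level = z_min + shift                                    # one common sigma-level convention
    return {"Cp": cp, "Cpk": cpk, "Cpm": cpm, "DPM": dpm, "Sigma level": sigma_level}

# Illustrative spec limits, target and simulated output values (placeholders only).
rng = np.random.default_rng(1)
output = rng.normal(10.2, 0.35, size=10_000)
print(six_sigma_stats(output, lsl=9.0, usl=11.4, target=10.2))
```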

Finally, a simulation is run that cycles through many iterations of the model, replacing the previously static inputs with samples from the probability distributions described. The results are then collected across the iterations and summary statistics produced.
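The essence of that step can be sketched in a few lines of Python. This is not @RISK's simulation engine; it simply shows the idea of replacing static cell values with random draws and summarising the output across iterations, using a hypothetical two-input clearance model (housing bore minus shaft diameter) with made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
n_iterations = 10_000

# Inputs: fitted distributions stand in for the previously static cell values.
# (Hypothetical example: clearance = housing bore - shaft diameter.)
housing_bore = rng.normal(25.40, 0.05, size=n_iterations)
shaft_diam = rng.normal(25.00, 0.04, size=n_iterations)

# Model: the same formula the spreadsheet would use, evaluated once per iteration.
clearance = housing_bore - shaft_diam

# Summary statistics across all iterations.
print(f"mean = {clearance.mean():.4f}")
print(f"std dev = {clearance.std(ddof=1):.4f}")
print(f"5th / 95th percentiles = {np.percentile(clearance, [5, 95])}")
```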



As well as producing the Six Sigma statistics, the Monte Carlo analysis examines the relationship between the variation introduced at the inputs and the resulting variation in the output, identifying which factors have the greatest influence on the observed outcomes. This provides a useful indicator of priority when targeting improvement initiatives.
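The Python sketch below shows one simple way such a ranking can be produced, by computing the rank correlation between each simulated input and the output; it is a conceptual illustration rather than @RISK's own sensitivity calculation. The three-input model and its parameters are hypothetical, with temperature included as an input that has little real influence.

```python
import numpy as np
from scipy import stats

def spearman(a, b):
    """Spearman rank correlation via Pearson correlation of the ranks."""
    return np.corrcoef(stats.rankdata(a), stats.rankdata(b))[0, 1]

rng = np.random.default_rng(7)
n = 10_000

# Hypothetical inputs and model output (extending the clearance example above).
inputs = {
    "housing_bore": rng.normal(25.40, 0.05, size=n),
    "shaft_diam": rng.normal(25.00, 0.04, size=n),
    "temperature": rng.uniform(18.0, 28.0, size=n),
}
output = (inputs["housing_bore"] - inputs["shaft_diam"]
          + 0.001 * (inputs["temperature"] - 23.0))

# Rank inputs by the magnitude of their rank correlation with the output:
# the bigger the magnitude, the more that input drives the output variation.
ranking = sorted(((abs(spearman(x, output)), name) for name, x in inputs.items()),
                 reverse=True)
for corr, name in ranking:
    print(f"{name:14s} |rank correlation| = {corr:.3f}")
```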



Conclusion

If you are an @RISK user but are still on an earlier version such as @RISK 4.5, I strongly suggest you look at upgrading to the new @RISK 5.5.1 to access the Six Sigma features and to experience the substantial processing speed improvements. Upgrades to new versions of @RISK are free to customers who keep their annual maintenance contract current; customers who have let their maintenance contract lapse must pay an upgrade fee to get the new version.

