Simulation vs. Iteration

Dan Morris
11/18/2015

Simulation – can you really do without it?

Today the BPM community is somewhat split about simulation vs. rapid iteration. In the interest of full disclosure, for years I was a proponent of rapid iteration. I have now, however, been somewhat enlightened by iBPM and the role that simulation can play in shortening the overall cycle from solution ideation through deployment. I have also found that a design can be proven optimal before any new business design construction requirements, or their accompanying application development requirements, are considered. That reduces time, improves solution quality, and cuts cost.
But many BPMS vendors continue to resist simulation, claiming that rapid iteration negates the need for it. I disagree, and while I continue to be a proponent of rapid iteration, I believe it gains value when it is performed under the umbrella of design simulation.

So, can we live without simulation?

In the vast majority of companies, the answer in the past has been yes we can – and they have. There is a growing need, however, to curb the unending business redesign iterations, the project delays, the elongation of delivery as Agile sprint after sprint is iterated, the benefit lost to delayed implementation, and so on down the list.

For many, that is BPM today. But that was the state of the art of the past. With iBPM, that is changing.

First, let me stress that BPM is a business transformation enabler and can be done with traditional IT or with a BPMS tool suite. A BPMS is a set of technology tools that creates a new technical operating environment: it pre-automates much of the application development process and provides capabilities to monitor, control, and measure the flow of work. This is a technical play in the project.

The real advantage of BPM, however, is only realized when BPM activity is supported by a BPMS tool suite. Today this suite may or may not be from a single vendor. For example, some BPMS vendors use Visio as a modeling tool and export business models into their product suite as a starting point. Others have specialty tools that they interface with – such as simulation modelers and even some performance measurement tools for Six Sigma monitoring.

The Problem

The real problem is that with rapid iteration, each redesign is an educated guess, but it is still a guess. A design iteration may be printed, scribbled on paper, or drawn in a modeler, but it is a concept of a possible future operating model. As such, the design is reviewed by the project team and the user, and in some companies by the extended project group that includes those who might be affected by the solution (collaborators). The design is discussed, and improvements are considered and made. Those improvements may or may not really improve the workflow or move the business toward an optimized operation. Once the changes are completed and the models have been updated, they are reviewed by everyone and changed again. This process continues until someone with authority decides the new design is good enough. At that point it is still unproven, but it should work. That is the iteration concept.

The problem is that this process may take a long time, and the design may go through many iterations with no guarantee that it really improves efficiency or eliminates problems. But the design that was declared finished will now be built. That build may be based on BPMS generation with external interfaces and changes to legacy applications, or it may be the construction of new applications using traditional programming languages.

Because the design's efficiency is a guess at the time it is declared finished, it may or may not work well when it is built. That spawns another, now more expensive and longer, solution iteration process. With enough time the solution can be iterated until it becomes acceptable, and with more time it can be made optimal. That is obviously true, but it is disruptive to the business and it is a long, costly process.

This actually nullifies the fundamental benefit of BPMS-enabled BPM: speed. While each iteration is fast, the multiple iterations in the whole cycle from ideation through final solution deployment may not be. The typical response to this argument is that in BPM, once a business operation is redesigned, the updating and modification of the processes in the business unit never stops. That is a great concept, but given financial reality and the competition for BPM and BPMS resources, it is also not really true.

So how can this solution development process itself be improved?

I would like my readers to consider a new way of approaching this traditional business design iteration. If a simulation modeler is used to evaluate the efficiency of each iteration, the game changes. The design evaluation splits into two parts. First, people need to determine whether the design represents an improvement in effectiveness: doing only the things that must be done to deliver the product, component, or service. That is a person's call, and it should be made before the design is iterated and then repeated as a test for each iteration. Clearly, any solution must be effective, and thus each iteration needs to pass the effectiveness test.
So, if we assume that the iteration passes the effectiveness test, we come back to efficiency. Any solution must be as efficient as possible, and at a minimum, each design iteration must be more efficient than the last one.

But effectiveness, while not an easy determination, is easier than efficiency. You either need to do something or you don't. That is a different consideration from what makes a given sequence or group of activities efficient. For efficiency, I recommend that a formal simulation tool be used to help expedite the evaluation.

The key concept is that the simulation tool gives you the information you need to focus improvement in the next iteration. However, adopting simulation does mean that you will need to change your business design information gathering to include the data that the simulation tool needs to run.

For companies that do not measure performance, transaction volumes, and the like, this will require the team to start with estimates that have been worked out with the business managers and staff. At the time these estimates are created, the team should also work with the business user to begin to put measurement activities in place. This will allow the business managers and staff to get used to the idea of being measured and give them input into what will be measured, how it will be measured, and how the measurements will be evaluated.
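To make that data gathering concrete, the sketch below (in Python) shows the kind of starting estimates a team typically assembles for a simulation modeler: transaction volumes, activity handling times, decision exit probabilities, and resource capacities. The process, names, and numbers are purely hypothetical placeholders, not figures from any real engagement.

```python
# Hypothetical starting estimates for a claims-handling process,
# worked out with the business managers and staff.
# Every name and number here is an illustrative placeholder.
process_estimates = {
    "arrivals_per_hour": 12,          # estimated transaction volume
    "activity_minutes": {
        # activity name: average handling time in minutes
        "register_claim": 5,
        "validate_claim": 12,
        "manual_review": 30,
        "issue_payment": 4,
    },
    "decision_exits": {
        # probability of each exit from the "validate_claim" decision
        "needs_manual_review": 0.25,
        "straight_through": 0.75,
    },
    "resource_capacity": {
        # number of people (or system slots) available per activity
        "registration_clerks": 2,
        "validators": 3,
        "reviewers": 2,
        "payment_clerks": 1,
    },
}
```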

The metrics will then be aligned to the business activity/flow models at the point they apply and will be used in the simulator to help determine path load capabilities, decision exit probability, and other workflow efficiency measures. This will, in turn, tell the simulator what it needs to know to determine bottlenecks, redundant activity, and more. These "issues" in the simulation report will then guide the team in looking at corrective actions that would change the To Be design’s workflow, timing, IT support, etc.
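As a rough illustration of what a simulator does with those metrics, here is a minimal discrete-event sketch in Python, assuming the open-source simpy library. The process shape, times, and probabilities are the same hypothetical placeholders as above. It routes work through a decision exit, constrains each activity by its resource capacity, and reports average queue wait per activity, which is one simple way a bottleneck shows up in a simulation report.

```python
import random
from collections import defaultdict

import simpy  # assumed dependency: the open-source simpy discrete-event library

SIM_MINUTES = 8 * 60          # simulate one working day
ARRIVALS_PER_HOUR = 12        # estimated transaction volume
REVIEW_PROBABILITY = 0.25     # decision exit: share of claims routed to manual review

waits = defaultdict(list)     # activity name -> observed queue waits in minutes

def do_activity(env, name, resource, minutes):
    """Queue for the activity's resource, record the wait, then spend the handling time."""
    arrived = env.now
    with resource.request() as req:
        yield req
        waits[name].append(env.now - arrived)
        yield env.timeout(minutes)

def claim(env, resources):
    """One transaction flowing through the hypothetical To Be design."""
    yield from do_activity(env, "register", resources["register"], 5)
    yield from do_activity(env, "validate", resources["validate"], 12)
    if random.random() < REVIEW_PROBABILITY:      # decision exit probability
        yield from do_activity(env, "review", resources["review"], 30)
    yield from do_activity(env, "pay", resources["pay"], 4)

def arrivals(env, resources):
    """Generate claims at the estimated volume (Poisson arrivals)."""
    while True:
        yield env.timeout(random.expovariate(ARRIVALS_PER_HOUR / 60.0))
        env.process(claim(env, resources))

random.seed(42)
env = simpy.Environment()
resources = {
    "register": simpy.Resource(env, capacity=2),
    "validate": simpy.Resource(env, capacity=3),
    "review":   simpy.Resource(env, capacity=2),
    "pay":      simpy.Resource(env, capacity=1),
}
env.process(arrivals(env, resources))
env.run(until=SIM_MINUTES)

# Activities with long average waits are candidate bottlenecks for the next iteration.
for name, observed in waits.items():
    print(f"{name}: average wait {sum(observed) / len(observed):.1f} min "
          f"({len(observed)} items completed queueing)")
```

Changing a capacity or an exit probability and re-running takes seconds, which is exactly how each design iteration can be tested for efficiency before anything is built.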

These simulation reports can then be used as input to the team for the next improvement iteration, focusing further design change on the exact points where something needs to change to improve the efficiency of the work and its flow. In this way, guesswork is taken out of the efficiency side of the design process, and both time and money are saved while the overall quality of the design is improved.

Thank you for reading my column. As always, I hope it made you think. Please contact me with any questions, comments or suggestions related to this column. I can be reached at daniel.morris@wendan-consulting.com or 630-290-4858.

