3 Tips for Using Data During the Operational Excellence Process ft. Serco
What would the decision-making process of your organisation’s Operational Excellence programme look like without data to support it? Is the data you are currently using the right data? How can you take your programme to the next level by rethinking your current data strategy?
In conversation with Operational Excellence veteran Paul White, this piece explores the importance and place of data in achieving enterprise-wide improvement, and shares three actionable tips for successfully using data as a driving force in your organisation.
Paul White is the Operational Excellence (OE) Director at Serco. With over 20 years of OE experience gained within some of the world’s largest organisations, he has extensive OE knowledge across a breadth of industry sectors.
Paul was a member of the advisory board for OPEX Week Europe, and on Tuesday 8 October 2019 he participated in a panel discussion that evaluated the importance of data analytics within core business strategy. Here, Paul shares three tips for using data during the Operational Excellence process.
Verify data integrity
It is important to verify data integrity to avoid GIGO (garbage in, garbage out): if bad data enters the system, the analysis results will be misleading and will lead to poor decision making.
In a manufacturing environment, GIGO is avoided by conducting Repeatability and Reproducibility studies to ensure that the measurement system is robust. In a service environment, data validation ensures robust data entry, and operational definitions ensure consistent data collection.
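As a simple illustration of service-environment data validation, the sketch below checks incoming records against an operational definition before they reach any analysis. The record fields (`case_id`, `handle_time_mins`, `status`) and allowed status values are hypothetical, invented for the example rather than taken from any Serco system.

```python
def validate_record(record):
    """Return a list of problems that would make this record 'garbage in'."""
    errors = []
    if not record.get("case_id"):
        errors.append("missing case_id")
    # Handle time must be present and non-negative to be usable in analysis.
    if record.get("handle_time_mins") is None or record["handle_time_mins"] < 0:
        errors.append("invalid handle time")
    # Operational definition: only these statuses are consistently collected.
    if record.get("status") not in {"open", "closed", "escalated"}:
        errors.append("unknown status")
    return errors

records = [
    {"case_id": "C001", "handle_time_mins": 12, "status": "closed"},
    {"case_id": "", "handle_time_mins": -5, "status": "done"},
]
for r in records:
    problems = validate_record(r)
    if problems:
        print(r.get("case_id") or "<blank>", "rejected:", problems)
```

Rejecting or flagging such records at entry keeps downstream analysis from being skewed by data that was never valid in the first place.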
Data-driven project selection
Use data to drive project selection. It is critically important to ensure that OE is focused on the highest-priority areas for improvement. This secures Senior Leadership support: sponsoring the project, allocating resources, committing to improvements and removing roadblocks.
I have seen OE practitioners working on projects that were selected sporadically based on ‘who shouted loudest’. The result is that projects progress slowly, suffer many setbacks due to poor sponsorship and, in the worst-case scenario, deliver no improvement at all.
Therefore, conduct Pareto analysis to drive data-driven project selection and separate the ‘vital few’ from the ‘trivial many’.
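The Pareto step can be sketched in a few lines: rank issue categories by frequency and keep the smallest set that covers roughly 80% of occurrences. The complaint categories and counts below are hypothetical, purely for illustration.

```python
from collections import Counter

def pareto_vital_few(issues, threshold=0.8):
    """Return the categories that together cover `threshold` of all occurrences."""
    counts = Counter(issues)
    total = sum(counts.values())
    vital, cumulative = [], 0
    for category, count in counts.most_common():  # most frequent first
        vital.append(category)
        cumulative += count
        if cumulative / total >= threshold:
            break
    return vital

# Hypothetical complaint log for a service team (100 entries in total)
issues = (["billing error"] * 45 + ["late delivery"] * 30 +
          ["wrong item"] * 12 + ["damaged item"] * 8 + ["other"] * 5)
print(pareto_vital_few(issues))
# → ['billing error', 'late delivery', 'wrong item']
```

Here the top three categories account for 87% of complaints, so they are the ‘vital few’ that merit a sponsored OE project; the remaining categories are the ‘trivial many’.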
Incorporating data science techniques within Operational Excellence
This is a relatively new and exciting evolution which combines well with Operational Excellence to gain process insight from data.
We are currently conducting a Process Mining proof of concept to understand large data sets stored within IT workflow systems. Process Mining is a powerful technique which is very applicable to the service sector. I call it ‘Lean for IT systems’ because it creates a pseudo value stream map of an IT process to identify flow, bottlenecks and constraints. We are at the early stages of our Process Mining proof of concept, but it has already produced some interesting results, notably the quantification of an internal rework loop that was a ‘hidden factory’.
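To make the idea concrete, here is a minimal sketch of the core process-mining step: building a ‘direct-follows’ map from an event log and flagging cases where an activity repeats, i.e. a rework loop. The ticket log, case IDs and activity names are hypothetical; a real proof of concept would use a dedicated tool or library rather than this hand-rolled version.

```python
from collections import Counter, defaultdict

def mine_direct_follows(event_log):
    """From (case_id, activity) pairs, ordered within each case, build
    direct-follows transition counts and flag cases with rework loops."""
    traces = defaultdict(list)
    for case_id, activity in event_log:
        traces[case_id].append(activity)

    transitions = Counter()
    rework_cases = set()
    for case_id, acts in traces.items():
        transitions.update(zip(acts, acts[1:]))   # pairs of consecutive steps
        if len(set(acts)) < len(acts):            # an activity repeats: rework
            rework_cases.add(case_id)
    return transitions, rework_cases

# Hypothetical IT workflow log: ticket T2 is triaged twice (hidden rework)
log = [
    ("T1", "Log ticket"), ("T1", "Triage"), ("T1", "Resolve"),
    ("T2", "Log ticket"), ("T2", "Triage"), ("T2", "Resolve"),
    ("T2", "Triage"), ("T2", "Resolve"),
]
transitions, rework = mine_direct_follows(log)
print(rework)                                  # → {'T2'}
print(transitions[("Resolve", "Triage")])      # → 1 (the rework loop edge)
```

The transition counts are exactly what gets drawn as the pseudo value stream map: each edge weight shows how often work flowed between two steps, and back-edges like Resolve → Triage quantify the hidden factory.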
We are also using Microsoft Power BI to create dashboards that visualise OE performance across different Business Units. This approach provides an interactive interface that enables the user to drill down from overall performance into their area of interest.
The key advice for incorporating data science into OE is to start with the problem to be solved. Data science offers many analysis opportunities, which can over-excite people at first: they produce many fancy charts, but what problem are they trying to solve? This mistake is similar to John Seddon’s ‘Lean tool head’ principle, which states that people ‘blindly apply lean tools, without thinking about the problem to be solved’.
Therefore, start with the problem first and then choose the appropriate data science technique to solve the problem.
Regardless of your industry, it is critical to understand how you can best use data to achieve excellence within your organisation.