Bringing data and process together - the future of process excellence: Interview with K2's Rob Speck

Contributor: Rob Speck
Posted: 08/21/2014

In PEX Network's latest survey on technology investment plans of process professionals, Big Data and analytics has emerged as the number one investment area in the year ahead. How do data and process fit together?

Rob Speck, Vice President, Global Services, at BPM software provider K2 weighs in.

PEX Network: What role do you see technology playing in supporting companies as they move from tactical, quick-win process improvements to improvements that really span the end-to-end processes across the enterprise?

Rob Speck: I always smile when I hear the phrase end to end followed by a specific technology like SAP or Salesforce. The idea that any of these technologies handles true end-to-end processing is a bit of a corporate urban myth. Perhaps when artificial intelligence becomes an affordable reality, we will see end-to-end process automation. But the fact is that people are currently, and will likely remain, central to process execution during our lifetime.

That said, technologies that allow organizations to automate specific parts of end-to-end processes and fill the gaps that these monolithic technologies cannot serve are having major impacts on organizations large and small. Over the last two years I've worked with companies as diverse as Microsoft, Chevron and Wells Fargo, as well as state and local governments and even smaller organizations like law firms, and they're dealing with very specialized process areas, such as customer intake, employee onboarding, equipment management and other such processes.

When you look at each of these process areas, they're typically deep within what we would call true end to end. While our technology and others can automate those areas, they're subsets of massive end-to-end processes, like procure-to-pay and order-to-cash.

So organizations may start small and focus on specific low-hanging fruit. They build out a variety of these business applications and then realize, hey, I've got something. Once you do that, you have the ability to look at it and say, I've got these business apps that I'm able to build quickly and easily, get them deployed, and impact the business quickly.

The next step that some companies take is to look at larger platforms and create standards for building process automation. But the effort to implement and manage these large BPM platforms is usually tremendous. It takes organizations over a year in most cases to get any value out of those investments, and the cost of gaining and maintaining the requisite skills is often overlooked, which can result in extraordinary consulting or contractor costs.

Some of the most innovative and lean organizations I've had the pleasure to work with during the past few years have seen the benefit of building business applications rapidly and then leveraging their best practices and technical assets centrally to benefit all process areas.

PEX Network: Why do you think companies are starting to look to invest in big data and analytics in the year ahead? Why now?

Rob Speck: I think it's a combination of factors, including pervasively connected devices that send data wirelessly about everything: you've got machine states, customer locations, atmospheric conditions and on and on. You basically have all of these devices sending massive amounts of data. Today, we can monitor almost anything with embedded sensors that simply send data at unprecedented frequencies. If you couple those trends with the ever-cheaper cost of storing data, we have a world full of data hoarders. We're constantly trying to glean some truth out of all this information that we can now store.

When we look at that exponential growth in the data you can store in the enterprise, we can see that there's value in having tools that can identify what's going on, perhaps trends, correlations and patterns, that can actually aid future decisions.

PEX Network: How do you see big data and business process management actually sitting together?

Rob Speck: It's still a very early growth stage. There are a variety of organizations with technologies that can handle data flowing in using event-enabled processing and service-oriented architectures. You've got things like in-memory processing. And you have other organizations that are storing everything that comes in. Regardless of your architecture, regardless of the way you're handling the big data challenge, at a certain point you basically want to take action.

Take an example like Barclays Bank. If you're at Barclays, you might see an inordinate number of password changes on online accounts. That number might pass a specific acceptable threshold, which then triggers an alert to investigate, as there may be potential fraud activity hitting accounts.

Next, you want some kind of action. It may be completely automated, like freezing the password change function across your system so no further changes can happen. But you may have other, more complex sets of activities that are manual as well. You may have roles in the organization that are alerted to specific human tasks, perhaps exploring the root cause of this specific suspicious condition.

In short, when we talk about the value of big data and various technologies, at the end of the analysis you want to trigger a process. And this is where we've seen our technology become an important part of harnessing the power and value of big data.
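The threshold-then-trigger pattern Speck describes, detecting a burst of password changes and kicking off both an automated action and a human task, could be sketched as follows. The class name, the five-changes-per-hour threshold and the action labels are illustrative assumptions, not details of K2's or Barclays' actual systems:

```python
from collections import deque
from datetime import datetime, timedelta

class PasswordChangeMonitor:
    """Tracks password-change events per account over a sliding window
    and flags bursts that exceed an acceptable threshold."""

    def __init__(self, max_changes, window=timedelta(hours=1)):
        self.max_changes = max_changes
        self.window = window
        self.events = {}  # account_id -> deque of event timestamps

    def record_change(self, account_id, timestamp):
        """Record one password change; return an action dict if the
        threshold trips, otherwise None."""
        q = self.events.setdefault(account_id, deque())
        q.append(timestamp)
        # Drop events that have aged out of the sliding window
        while q and timestamp - q[0] > self.window:
            q.popleft()
        if len(q) > self.max_changes:
            # Automated step plus a manual follow-up, as in the interview
            return {"action": "freeze_password_changes",
                    "human_task": f"investigate possible fraud on {account_id}"}
        return None
```

The automated part (freezing password changes) and the manual part (a routed human task) come back together from one detection event, which is the hand-off point between the analytics layer and the process layer.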

PEX Network: Now, there's always a risk with new technology like big data that companies just jump on the bandwagon and then become disillusioned when the reality doesn't live up to the original expectation. Do you have any advice for how process improvement professionals can help ensure that their companies don't fall into this "trough of disillusionment" with big data or any other technology investments?

Rob Speck: I've seen that curve that Gartner draws, and it does happen with nearly every technology wave. In my view, it's all about the business drivers: what are you trying to achieve? Where you've identified opportunities in the business, how can insight into things like customer behavior, operations quality and demand generation help improve them?

I've seen companies like Federal Express use environmental monitoring to ensure biological samples stay within a necessary temperature range during transport from clinics to labs for testing. In this instance, they knew they needed to monitor conditions more accurately, with a constant flow of data, because that would reduce the incidence of faults that could ruin a shipment, which in turn can cost the company millions, not just in lost revenue but also in lost customer satisfaction and loyalty. So once you understand the business case and that business case is there, as with any technology, it's about taking a pragmatic approach.
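The shipment-monitoring case boils down to checking a stream of sensor readings against an acceptable range and surfacing any faults. A minimal sketch, assuming a 2–8 °C cold-chain range and a simple list of timestamped readings (both illustrative, not FedEx's actual parameters):

```python
# Hypothetical acceptable range for biological samples in transit (deg C)
TEMP_MIN, TEMP_MAX = 2.0, 8.0

def check_shipment(readings):
    """Return the (minutes_elapsed, temperature_c) readings that fall
    outside the acceptable range; an empty list means no faults.

    `readings` is a list of (minutes_elapsed, temperature_c) tuples
    streamed from an embedded sensor."""
    return [(t, temp) for t, temp in readings
            if not (TEMP_MIN <= temp <= TEMP_MAX)]
```

Each fault returned here would be the event that triggers a follow-up process, rerouting the shipment, notifying the customer, or opening an investigation task.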

Another example is Microsoft. They started in one specific business area and, after proving the technologies, the architecture and the approach, moved to another business unit and so on. One of the impressive things I've seen working with Microsoft is that they approach every new technology in a sensible way; I've never seen them take a boil-the-ocean approach. They realize the value quickly, expand, take a step back, look at how to leverage that across business units, and then centralize if, and only if, it makes sense.

So in short, I'd say use technologies that allow you to gain value in just a few months. Stay away from massive platforms that promise the moon. Do that, and you should avoid that trough of disillusionment.

