Business Process Management (BPM) tools have evolved over time, shaped by various points of view. Early process orchestration tools had a distinct IT runtime perspective. With the advent of BPMN 2.0, there has been a perceptible shift in emphasis toward business process modeling tools. But without a shared understanding of what a process is, both IT-side and business-side teams are likely to falter in creating top-notch BPM systems using BPMN.
A first step toward such an understanding may require admitting that "process," in the words of Derek Miers, an analyst at Forrester Research, signifies different things to different people. According to Miers, some people are referring to "procedures" when they discuss processes, procedures being the sequences of steps that make up a business process. Others discussing processes may actually be referring to "practices," or approaches to work. The distinction is especially significant given the generally felt need today to build flexible systems rather than rigid ones.
An emphasis on existing procedures can hamstring some BPM efforts. This happens when people work hard to improve what they’ve got rather than putting effort into developing something that they truly need, says Forrester's Miers.
The reality, he notes, is that “process” is often just a proxy for an organizational chart and “whatever functional decomposition there is, is really just a mechanism for apportioning blame.”
According to Miers, this approach creates "kebabs," the familiar bulges on organizational charts and process diagrams.
“When I ask people if they have ever seen a process they say 'yes' and start describing a flow diagram,” he says, while adding, "That’s not really a process."
Modeling suffers from the same kinds of thinking, he says, namely a tendency to leave out a great deal in an effort to make something tidy and comprehensible. Miers says the fundamentals should include the order of activities, who controls the tasks, and information about collaboration, synchronization, and other factors.
Miers says models are used to communicate, but many people think that if they put all the processes they have modeled together, they will be able to reuse them, as if by magic. "That's just rubbish," he says.
What too often ends up happening, according to Miers, is that people tend to over-engineer a process in order to make it applicable in several use cases. "But it ends up not being useful to anybody,” he says.
His solution? Discard old Newtonian cause-and-effect approaches and accept chaos and constant change. Instead, consider the premise of Metcalfe's Law, which states that the value of a network is proportional to the square of the number of connected users. In other words, don't try to create an airtight, final model; focus on adaptability and evolution.
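As a rough numerical illustration of Metcalfe's Law (a hypothetical sketch, not from the article): the number of distinct pairwise connections among n users is n(n-1)/2, which grows roughly as the square of n, so doubling the user base roughly quadruples the connection count.

```python
def metcalfe_links(n: int) -> int:
    """Count the distinct pairwise connections among n users.

    Metcalfe's Law says a network's value is proportional to the
    square of its user count; n*(n-1)/2 is the exact link count,
    which grows on the order of n squared.
    """
    return n * (n - 1) // 2

# Doubling users roughly quadruples the number of possible links:
print(metcalfe_links(10))  # 45
print(metcalfe_links(20))  # 190
```

The point mirrors Miers' advice: value comes from connections and interactions that multiply as the system grows, not from a single fixed model.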
“If you look at the models and what they are modeling you will usually find that systems and processes change all the time, so after six months or even three months, the model needs to be revised,” he says.
In a less philosophical vein, Miers also offers advice on tools. In his view, the more standard and more widely used ones are the better choices. "There are many vendors selling tools on a shaky business model – they charge a lot and they have very few customers," he says. These tools, mostly built around repositories that must be populated, are also time-consuming to use, he says.
This was first published in May 2011