Move over SOA, data binding shows promise for workflow management

SOA and BPEL have often been used to improve agility, but data binding could prove to be a good alternative.

When software is componentized to encourage reuse or improve agility, the components tend to lose any built-in sense of how work should flow among them. A decoupled strategy must then be introduced to guide work through a chain of components for processing. SOA and business process execution language (BPEL) have been used to support such workflow management, but other approaches, such as process or data binding, show promise.

There are three challenges for architects:

  • Understanding the data-driven model
  • Coping with components designed for other workflow strategies
  • Optimizing data for process management effectiveness

Controlling state with a back-end model

Data-driven approaches to component binding are extensions of explicit back-end state control. If SOA components are not inherently stateful, then state must be provided by the caller (the RESTful model) or stored somewhere to be accessed by components when they run. Client-side state control is increasingly popular, but it depends on the browser or user to set context in a workflow. That can be problematic when load balancing is done across copies of a given component. Stateful load balancing can work, but it can be complicated.

In the back-end model, application activity is represented by a data model that contains the information needed to establish context for the components as they are run. This approach can be used to set the context or state for RESTful applications. The back-end model works even if different stages of an activity -- different states -- are processed by other component copies. As long as the back-end model is available, the components can set state from it.
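As a rough sketch of the idea, consider a hypothetical ContextStore keyed by a transaction identifier; the class, method and field names here are illustrative, not drawn from any particular product. The point is that any stateless component copy can set its state from the shared model before it runs, which is what makes load balancing across copies workable.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical back-end context store: components stay stateless and
// rehydrate their working context from here, keyed by transaction ID.
public class ContextStore {
    // In practice this would be a shared database or cache, not a local map.
    private final Map<String, Map<String, Object>> contexts = new ConcurrentHashMap<>();

    public Map<String, Object> load(String transactionId) {
        return contexts.computeIfAbsent(transactionId, id -> new HashMap<>());
    }

    public void save(String transactionId, Map<String, Object> context) {
        contexts.put(transactionId, context);
    }

    public static void main(String[] args) {
        ContextStore store = new ContextStore();

        // One component copy establishes context for transaction "tx-42".
        Map<String, Object> ctx = store.load("tx-42");
        ctx.put("accountId", "ACCT-1001");
        ctx.put("state", "GettingAccount");
        store.save("tx-42", ctx);

        // A different component copy (behind a load balancer) picks up the
        // same transaction and sets its state from the back-end model.
        Map<String, Object> resumed = store.load("tx-42");
        System.out.println("Resumed in state: " + resumed.get("state"));
    }
}
```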

If a back-end model is available, it's not a great stretch to imagine it can not only contain the appropriate state information but also carry contextual information about the workflow itself. The data model now describes how work is routed through processes. There are two general approaches to implementing data binding, one based on rules, the other on state or event analysis.

Using a rules-based approach or a state or event approach

The rules-based approach amounts to a kind of centralization of distributed BPEL elements. The data model is used by an applicationwide business logic element that looks at the information in the model and invokes the appropriate step or component. This means that any application consists of a data model, the business logic element and the list of components, which can be shared with other applications. It's not hard to adapt to this approach, but many architects think it offers little benefit compared with BPEL-coupled message-bus models.
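A minimal sketch of such a business-logic element, with hypothetical rules and stand-in components, might look like the following; a real implementation would pull the rules from configuration rather than hard-coding them.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Hypothetical central business-logic element: each rule inspects the shared
// data model and, if it matches, names the component to invoke next.
public class RuleDrivenOrchestrator {
    private final Map<Predicate<Map<String, Object>>, Consumer<Map<String, Object>>> rules =
            new LinkedHashMap<>();

    public void addRule(Predicate<Map<String, Object>> condition,
                        Consumer<Map<String, Object>> component) {
        rules.put(condition, component);
    }

    // Evaluate the rules in order against the data model and run the first match.
    public void step(Map<String, Object> dataModel) {
        for (Map.Entry<Predicate<Map<String, Object>>, Consumer<Map<String, Object>>> rule
                : rules.entrySet()) {
            if (rule.getKey().test(dataModel)) {
                rule.getValue().accept(dataModel);
                return;
            }
        }
    }

    public static void main(String[] args) {
        RuleDrivenOrchestrator orchestrator = new RuleDrivenOrchestrator();
        orchestrator.addRule(
                model -> model.get("account") == null,
                model -> model.put("account", "ACCT-1001"));   // stands in for a GetAccount component
        orchestrator.addRule(
                model -> model.get("history") == null,
                model -> model.put("history", "3 orders"));    // stands in for a FindHistory component

        Map<String, Object> dataModel = new HashMap<>();
        orchestrator.step(dataModel);  // runs the account-lookup rule
        orchestrator.step(dataModel);  // runs the history-lookup rule
        System.out.println(dataModel);
    }
}
```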

The state or event approach is borrowed from the finite-state machine model often used in protocol handlers. For this approach, the architect defines the application as a series of applicationwide states that differ from traditional SOA states in that they apply to the workflow context, not the component's own context within the flow.

For example, a simple application might have states like GettingAccount, FindingAccountHistory, PresentingData and UpdatingData. Everything that causes a change to the application can be considered an event. The way a given event is handled in each state can be visualized as a simple table, with states on one axis and events on the other. At the intersection of each state and event is the identity of the process to be run and the name of the next state. This model is efficient enough to be used in protocol handlers, so its high performance is proven.
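A rough sketch of such a state-event table, using the states above but with made-up events and process names, can be as simple as a map keyed by the state-event pair:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the state/event table described above: each (state, event) cell
// names the process to run and the next application state. The state names
// come from the example; the events and processes are illustrative.
public class StateEventTable {

    // A table cell: which process to invoke and which state to move to.
    static class Cell {
        final String process;
        final String nextState;
        Cell(String process, String nextState) {
            this.process = process;
            this.nextState = nextState;
        }
    }

    private final Map<String, Cell> table = new HashMap<>();

    void define(String state, String event, String process, String nextState) {
        table.put(state + "/" + event, new Cell(process, nextState));
    }

    // Look up the cell at the intersection of the current state and the event.
    Cell lookup(String state, String event) {
        return table.get(state + "/" + event);
    }

    public static void main(String[] args) {
        StateEventTable app = new StateEventTable();
        app.define("GettingAccount",        "AccountFound",   "FetchHistory",  "FindingAccountHistory");
        app.define("FindingAccountHistory", "HistoryFetched", "RenderDisplay", "PresentingData");
        app.define("PresentingData",        "UserEdited",     "ApplyUpdate",   "UpdatingData");
        app.define("UpdatingData",          "UpdateDone",     "RenderDisplay", "PresentingData");

        String state = "GettingAccount";
        for (String event : new String[] {"AccountFound", "HistoryFetched", "UserEdited", "UpdateDone"}) {
            Cell cell = app.lookup(state, event);
            System.out.println(state + " + " + event + " -> run " + cell.process
                    + ", next state " + cell.nextState);
            state = cell.nextState;
        }
    }
}
```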

Overcoming common obstacles

The problem with any data-model-binding approach is the limitation of preexisting components. It's easy for an architect to drive component development to support such a model, but it's harder to impose it where components already assume more traditional SOA workflow.

The reason data binding is hard may be surprising. The difficulty is not in making data binding work with arbitrary component interfaces; it's that the application's state-event structure may be hard to define when component behavior can't be changed.

Any component interface can be accommodated in data-driven binding as long as the data model includes all the information needed to invoke the component. An architect can wrap components in whatever adapter design pattern is needed to harmonize the interfaces, and invoke the adapter in a standard way from the data model.
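For instance, a hypothetical legacy component with its own call signature can be wrapped so the binder invokes it exactly like any other step; the names here are illustrative only.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of wrapping a mismatched component interface in an adapter so the
// data-driven binder can invoke every step the same way. The legacy
// component and its adapter are hypothetical.
public class AdapterBindingExample {

    // The uniform contract the workflow engine expects: take the data model,
    // do the work, and record results back into the model.
    interface WorkflowStep {
        void execute(Map<String, Object> dataModel);
    }

    // A preexisting component with its own, incompatible signature.
    static class LegacyAccountService {
        String lookupAccount(String customerName) {
            return "ACCT-" + Math.abs(customerName.hashCode());
        }
    }

    // Adapter: pulls the arguments it needs out of the data model, calls the
    // legacy interface, and writes the result back into the model.
    static class AccountLookupAdapter implements WorkflowStep {
        private final LegacyAccountService service = new LegacyAccountService();

        @Override
        public void execute(Map<String, Object> dataModel) {
            String customer = (String) dataModel.get("customerName");
            dataModel.put("accountId", service.lookupAccount(customer));
        }
    }

    public static void main(String[] args) {
        Map<String, Object> dataModel = new HashMap<>();
        dataModel.put("customerName", "Acme Corp");

        WorkflowStep step = new AccountLookupAdapter();
        step.execute(dataModel);  // invoked the same way as every other step
        System.out.println(dataModel.get("accountId"));
    }
}
```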

The challenge is that some significant changes in overall application state may involve alterations that occur inside a component. For example, if a component both displays data and accepts an update within itself, those two states cannot be separated externally, which means the application can't be used with most data-driven approaches. However, embedding business logic within a component is poor design in any case: it inhibits reuse and makes BPEL orchestration of the application difficult. Under those conditions, it's best to change the application or components to adhere to best practices for Agile development.

Making sure the data model supports the collective needs of the components is critical. In any form of data-driven component binding, the process of application design has to pay more heed to issues of the data model.

It's easiest to think of an application as a series of transactions and to presume each transaction is supported by a workflow. The goal is to establish a data model that includes all the information a single transaction needs to navigate that workflow. That model must be carried through the workflow so that transaction context is preserved across components without reference to any other information.
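A bare-bones sketch of such a per-transaction model, with illustrative field names, might be no more than an identifier, the current workflow state and the accumulated business data:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a per-transaction data model: everything one transaction needs to
// navigate its workflow travels with it, so no component has to look elsewhere
// for context. The field names are illustrative.
public class TransactionContext {
    final String transactionId;       // identifies the transaction end to end
    String currentState;              // where the transaction is in the workflow
    final Map<String, Object> data;   // the business data accumulated so far

    TransactionContext(String transactionId, String initialState) {
        this.transactionId = transactionId;
        this.currentState = initialState;
        this.data = new HashMap<>();
    }

    public static void main(String[] args) {
        TransactionContext tx = new TransactionContext("tx-42", "GettingAccount");
        tx.data.put("accountId", "ACCT-1001");

        // Any component copy handed this object has the full context it needs.
        tx.currentState = "FindingAccountHistory";
        System.out.println(tx.transactionId + " is now in " + tx.currentState);
    }
}
```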

Many architects find it helpful to think in terms of a transaction metamodel that serves as the information exchange point among components. Properly designed, this will allow for component replication for load balancing and can be used to manage data recovery and resynchronization in the event of a failure.

Data-driven component binding can eliminate a considerable amount of processing overhead associated with message-bus and BPEL integration, but it also generates a lot of work and may reduce application flexibility. Architects need to review the benefits and risks carefully to make the optimum choice.

About the author:
Tom Nolle is president of CIMI Corp., a strategic consulting firm specializing in telecommunications and data communications since 1982.

This was first published in June 2014
