What's the biggest misconception people have of Web services and service-oriented architecture?
One of the top-level things is that in many respects this technology is being developed from the bottom up, so we're getting more sophisticated at the protocol interoperability aspect of Web services. As an industry, we've now converged on an understanding that it's about how applications message one another, but we don't understand nearly as well how to actually build those applications.
It's kind of like me going out and buying a very sophisticated woodworking tool and reading the manual for it. Now I've actually got to go design some furniture. That's a different thing and I think we're not nearly as sophisticated as we need to be. We don't understand how to build the services as well as we might.
What's one thing that has surprised you in a good way about Web services?
The industry has clearly needed to exploit the Web for application integration. One of the surprises was the route that it took to get where it is. One of the things that's interesting is that it went through this RPC [remote procedure call] phase that I didn't expect. I expected it to immediately go to the messaging phase, but it seems to have worked out in a way. It needed to go through the RPC phase to bring developers along. Once it got developers and the community moving in the right direction, which is Web application integration, it actually made an important adjustment, which is to shift from the RPC model of integration to the message-based model of integration.
Is it a good thing when technology takes some turns the vendor community didn't see coming? Does it shake out new ideas?
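The RPC-to-messaging shift described above can be sketched in a few lines. This is a minimal illustration, not anything from the interview; all names (`rpc_get_price`, `build_price_request`, `handle_message`) and the JSON message shape are hypothetical. The point is the coupling: the RPC caller is bound to a specific remote signature, while the message-based peer exchanges self-describing documents and can tolerate fields it doesn't know.

```python
import json

# RPC-style integration: the caller is coupled to a specific remote
# signature -- change the signature and every generated stub breaks.
def rpc_get_price(sku: str) -> float:
    # stand-in for a generated stub that marshals arguments positionally
    return {"WIDGET-1": 9.99}[sku]

# Message-based integration: the caller emits a self-describing document;
# the receiver reads the fields it understands and ignores the rest.
def build_price_request(sku: str) -> str:
    return json.dumps({"type": "price-request", "version": "1.0", "sku": sku})

def handle_message(raw: str) -> str:
    msg = json.loads(raw)
    if msg.get("type") == "price-request":
        price = {"WIDGET-1": 9.99}.get(msg["sku"])
        return json.dumps({"type": "price-response",
                           "sku": msg["sku"], "price": price})
    # unknown messages get an explicit error reply instead of a stack fault
    return json.dumps({"type": "error", "reason": "unknown message type"})
```

Because the contract is the document rather than the call signature, either side can evolve (add fields, change implementation language or platform) without breaking the other.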
We have now entered the phase where the computing model is a global model, and our job as vendors is to support that global computing model and support our customers being successful in using it.
Before the Web in the broadest sense, we didn't have such a thing, so we were all out trying to solve things in our different ways, whether it was TCP/IP vs. other technologies at that level in the stack, or other things. This coalescing around the global computing model is very helpful because it lets us concentrate on getting value out to our developers.
About that global computing model, are we hitting the point where we need to rethink how each layer functions and what you're capable of doing inside of each layer?
If you're looking at application integration, we're coming from a phase where we relied on the central coordination technology, whether it's a message-oriented middleware product or some other piece of middleware, as kind of the point through which everything is connected. The global model in effect says that you can't rely on any connecting fiber other than what it provides.
So when you write an application, you need to write it to the global model, which means that you can rely on HTTP, HTML and IMAP [Internet Message Access Protocol] and a number of the services that are ubiquitous, but you can't rely on more than the elements that are being integrated. If you do, you're creating a discontinuity that's hard to integrate through.
If you write an app that's directly tied to the particular message-oriented middleware product, your customers or your suppliers may not be using that same product. They may not be on the same platform you're on. They may not be on the same operating system. You have to ensure that you're writing to a model that only relies on what is available globally. Then you might optimize that by taking advantage of some local facilities.
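A small sketch of the principle above: instead of coding to a vendor MOM API (something like a hypothetical `mom.publish(queue, obj)`), the exchange is expressed using only ubiquitous pieces of the global model — a plain HTTP POST carrying an XML document. The endpoint URL, element names, and `make_order_request` function are all made up for illustration; only the HTTP and XML parts are the "globally available" elements the interview refers to.

```python
from urllib.request import Request

def make_order_request(endpoint_url: str, order_id: str, qty: int) -> Request:
    # The payload is a self-contained XML document, readable by any peer
    # regardless of its platform, operating system, or middleware choice.
    body = (
        '<?xml version="1.0"?>'
        f'<order><id>{order_id}</id><quantity>{qty}</quantity></order>'
    ).encode("utf-8")
    # Plain HTTP is the only transport assumed -- no vendor-specific wire
    # protocol between the two endpoints.
    return Request(
        endpoint_url,
        data=body,
        headers={"Content-Type": "text/xml; charset=utf-8"},
        method="POST",
    )
```

Local optimizations (a faster binary channel between two servers you control, for instance) can then be layered on without changing what external partners depend on.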
There are some unresolved issues about what is and isn't within the global model, which includes what you expect your endpoints to understand from a protocol perspective. Those will be worked out. We're converging more and more as we progress.
Web services use a lot of layered envelopes. Will that ever change?
My guess is as developers begin to implement services and actually put these into production, they will figure out how much of the protocol they need and how much they push up into their applications.
The details of the protocol aren't as important as what the developers decide they're going to use. For instance, would you store a SOAP message in a database? Likely not. You wouldn't make that part of your operational business data. You use it to communicate that data.
The messages that you've architected and the conversations that you've implemented in your services, their purpose is to establish this flow of information. What's really important at the application level is what are those schemas and those conversations so you can get useful work done and how do you implement those on either end? To some degree, the protocol is incidental.
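The "protocol is incidental" point above can be made concrete: a SOAP envelope is transport plumbing, and what an application would actually persist or act on is the business document inside the Body. This sketch uses Python's standard `xml.etree` and the real SOAP 1.1 envelope namespace, but the `purchaseOrder` payload and the `extract_payload` helper are hypothetical.

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def extract_payload(soap_message: str) -> ET.Element:
    """Return the application document carried inside the SOAP Body.

    The envelope itself is discarded: it is how the data travels,
    not the operational business data you would store.
    """
    envelope = ET.fromstring(soap_message)
    body = envelope.find(f"{{{SOAP_NS}}}Body")
    if body is None or len(body) == 0:
        raise ValueError("no payload in SOAP Body")
    return body[0]  # first child element: the business payload

example = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
    '<soap:Body><purchaseOrder><id>42</id></purchaseOrder></soap:Body>'
    '</soap:Envelope>'
)
payload = extract_payload(example)  # the purchaseOrder element, not the envelope
```

What matters to the application is the schema of `purchaseOrder` and the conversation it belongs to; the envelope layers come and go at the protocol level.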
As developers, we don't worry about the details of TCP/IP. In previous lives we didn't worry about the internal details of a MOM [message-oriented middleware] protocol. We worried about how we used those in a larger context. I'm sure there were features of MOM systems that the MOM folks felt were core elements but that were never actually exploited by developers, because they decided they didn't actually need those features. I think the same thing will happen with Web services. We don't really know how that will all work out, but there are likely aspects of the Web services protocol that will be ignored by developers and some that will be critical.
Is Java vs. .NET spurring innovation or creating an integration problem that may never be fixed?
To be more specific, if you look at SOA, in the past, when IT was dealing with middleware, it just had to take what its vendors gave it and deal with the hanging strings and try to fit them together as best it could. Today, to some degree, what SOA represents is the global model built on open standards, and potentially IT has the ability to set for itself its specific view of those standards. Not that IT is going to develop them on its own, but it will decide what kind of computing standards it expects its vendors to conform to.
It will be up to vendors to adequately support those standards so that when IT wants to integrate across those vendors, it doesn't hit stumbling blocks that it's not expecting. There'll be pressure to properly deliver services. To me, that's the promise of SOA, in that IT will be putting pressure on its vendors to conform. Certainly, vendors will continue to compete on quality of implementation.
A change has happened and most people haven't actually seen it yet. They don't realize that it's actually happened. You can't come to somebody with proprietary technology and say that you've got some magical solution that doesn't fit the global computing model. Nobody wants it.