Lines are blurring in the area of Web services testing, according to industry experts. Runtime questions are merging with design-time questions, and the traditional wall between developers, software architects and their QA colleagues is coming down.
Testing Web services by themselves, or software with a simple Web services wrapper, is straightforward, said Jason Bloomberg, a senior analyst at ZapThink LLC. "The challenges come when you start thinking about Web services in the context of a service-oriented architecture," he said.
"In an SOA with multiple services, you look at metadata associated with those services -- whether that's a registry or a repository of services information -- and you test the functionality not of the individual service, but whether you're handling versioning properly, [for example]. These runtime questions blur into design-time questions."
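Bloomberg's versioning example can be sketched in code. The sketch below is entirely hypothetical -- the service names, the registry map and the `isCompatible` check are invented for illustration -- but it shows the kind of metadata-level test he describes: the service body itself isn't exercised, only whether the version a consumer was built against is still the one the registry publishes.

```java
import java.util.Map;

// Hypothetical sketch of a metadata-level SOA test: we check version
// handling against registry metadata, not the individual service's logic.
public class VersionCheck {
    // Stand-in for registry metadata: service name -> published major version
    static final Map<String, Integer> REGISTRY = Map.of(
            "orderService", 2,
            "billingService", 1);

    // A consumer built against a different major version fails the check
    static boolean isCompatible(String service, int consumerMajorVersion) {
        Integer published = REGISTRY.get(service);
        return published != null && published == consumerMajorVersion;
    }

    public static void main(String[] args) {
        System.out.println(isCompatible("orderService", 2));  // in sync
        System.out.println(isCompatible("orderService", 1));  // version drift
    }
}
```

A real registry would be queried over UDDI or a repository API rather than held in a map; the point is that the assertion targets the metadata, not the service.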
Traditionally, the software world thinks of design as development-centric, and runtime as operation-centric, Bloomberg said. "With SOA that distinction blurs: You're updating services on an ongoing basis, you have all the issues of operations testing and management, you're still doing development work and now development work breaks into infrastructure stuff. You're still doing component-based testing, but you have application development done at the process level, with composite applications built out of services."
Frank Cohen, founder of Campbell, Calif.-based PushToTest, which provides open source Web services testing tools, agreed that "testing Web services or SOA is a different animal. The functionality of a system or service under testing can change depending on the load, which is very different than if you test an application on a local machine. Each time you test in an SOA environment, you are challenged to make sure the system is in a state that's testable -- and you're not in control of all the components."
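Cohen's point about testability can be expressed as a guard step: before any functional assertion runs, the test verifies the environment is in a testable state, since the tester doesn't control every component. Everything below is invented for illustration -- the health probe is a placeholder, not a real API.

```java
// Hypothetical sketch: an SOA test that first checks the environment is
// in a testable state before asserting anything, reflecting Cohen's point
// that you are not in control of all the components.
public class GuardedTest {
    // Placeholder for a real health probe against a dependent service
    static boolean dependencyHealthy(boolean simulatedUp) {
        return simulatedUp;
    }

    static String runTest(boolean dependencyUp) {
        if (!dependencyHealthy(dependencyUp)) {
            return "SKIPPED: environment not testable";
        }
        // ... real functional assertions would run here ...
        return "PASSED";
    }

    public static void main(String[] args) {
        System.out.println(runTest(true));
        System.out.println(runTest(false));
    }
}
```

Reporting "skipped" rather than "failed" keeps an uncontrollable dependency from masquerading as a defect in the code under test.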
As a result, Cohen said, software developers and architects in the SOA space are for the first time following a service from architecture through to deployment, and playing a new key role in recommending the test approach QA will use.
"This is forcing developers to buddy up much sooner to the people putting a stamp of approval on it, and sometimes the developers are taking on the responsibility themselves," said Jonathan Rende, vice president of product marketing for Mountain View, Calif.-based Mercury Interactive Corp. The trend, he said, is a confluence of three factors: the notion of extreme programming, the distributed forces of SOA and the value it brings to an organization, and the ubiquity of developers who are Java-centric or .NET-centric.
Time savings is part of the reason for this trend, said Curtis Williams, senior developer at SOA consulting firm TeamSOA. "Development testing should be something that's precise, dedicated to finding a particular issue with a line of code, or unit testing. Unit testing and runtime testing differ. Runtime testing is part of a customer scenario; it might be something the developer can do, but development isn't really the environment that should be burdened with doing runtime testing. I believe that's a QA function. But the line is blurring, mainly to compress scheduling."
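Williams's distinction can be illustrated with a toy example (the discount logic and both checks are invented for this sketch): the unit-level check pins one piece of logic precisely, while the runtime-style check strings calls together into a small customer scenario.

```java
// Toy illustration of unit-level vs. scenario-level (runtime) testing.
public class TestLevels {
    // Code under test: discount in integer cents to avoid float drift
    static int applyDiscountCents(int cents, int percent) {
        return cents * (100 - percent) / 100;
    }

    // Unit test: precise, aimed at one particular piece of logic
    static boolean unitCheck() {
        return applyDiscountCents(10000, 10) == 9000;
    }

    // Runtime-style check: a small customer scenario composed of calls
    static boolean scenarioCheck() {
        int total = applyDiscountCents(10000, 10) + applyDiscountCents(5000, 20);
        return total == 13000;
    }

    public static void main(String[] args) {
        System.out.println(unitCheck() && scenarioCheck());
    }
}
```

In Williams's framing, a developer owns the first kind of check, while the second belongs in a QA environment -- even when, as he notes, scheduling pressure pushes both onto the same people.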
Still, he added, "there are extreme advantages to [eating] your own dog food. By running your services through your SOA [during development], you're doing a lot of validation that customers will appreciate."
While IT organizations and test tool vendors have traditionally looked at design-time testing and runtime testing as two separate problems, there are some tools that cross the line.
PushToTest, for example, offers the open source TestMaker framework and utility to build intelligent test agents. Cohen said TestMaker is designed to unite developers, QA and IT. TestMaker integrates with JUnit, an open source Java testing framework, which can be used for system-level functional tests. PushToTest said QA can use these same tests for scalability and performance, and IT can run them to monitor quality of service.
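The reuse pattern PushToTest describes -- one functional test that QA reruns under load and IT reruns as a monitor -- can be sketched without the TestMaker or JUnit APIs themselves. The sketch below is plain Java, with an invented `echo` stand-in where a real test agent would call a service endpoint.

```java
// Sketch of the write-once, reuse-everywhere pattern: the same functional
// check serves as a single pass/fail test, then is looped as a crude load run.
public class ReusableCheck {
    // Stand-in for a real service call; a real agent would hit an endpoint
    static String echo(String payload) {
        return payload;
    }

    // Functional test a developer would run once
    static boolean functionalCheck() {
        return "ping".equals(echo("ping"));
    }

    // QA-style reuse: loop the same check and count failures under load
    static int loadRun(int iterations) {
        int failures = 0;
        for (int i = 0; i < iterations; i++) {
            if (!functionalCheck()) {
                failures++;
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        System.out.println(functionalCheck());
        System.out.println(loadRun(100));
    }
}
```

IT's quality-of-service monitoring is the same loop run on a schedule against production endpoints; the value of the pattern is that all three groups share one definition of "working."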
"So a new interaction is happening," Cohen said. "Software architects and developers are hanging around throughout the whole service development lifecycle."
Mindreef's SOAPscope, a Web services diagnostic tool, is also being adopted by both developers and QA, said the company's chief technology officer, Mark Ericson. "Development-style testing will be more the unit type of testing or the individual service; QA may be testing more aggregate services or a composite application."
Business analysts have been able to use SOAPscope as well for proof-of-concept testing, he said. "Some people really like it for ad hoc testing. You can connect to any Web service and try out various scenarios effectively without having to write out test scripts."
The best approach to testing Web services in an SOA will ultimately evolve, said TeamSOA's Williams. The notion of testing will change as "Mindreef, Parasoft and others see the market with the idea of creating quality test organizations that understand Web services; it may be a necessity to push runtime testing into development."
Bloomberg said: "[Ultimately,] design versus runtime [testing] is two sides of the same coin. You have to make sure the code is working, but you also have to make sure the services and operation are working, too."