Service virtualization arises to meet services testing obstacles

Service virtualization allows teams to simulate services before they are truly available. That’s a plus for Agile development and integration testing.

Service virtualization underlies an emerging class of tools for modeling SOA components. Using a new form of simulation, development teams can now work with representations of software services long before they are actually available. 

That addresses some of the biggest bottlenecks in creating high-quality composite applications, and it helps Agile development efforts that must keep up with fast-moving Web applications.

“The idea of virtualization playing into the application development lifecycle is the most exciting technology we have seen for a while,” said Theresa Lanowitz, founder and analyst at Voke Media. “It will enable IT organizations, including the development, test, and operations teams to manage the classic cost, quality, schedule triangle.”

These tools allow software development teams to test out the performance and behavior of new services in the context of the existing or planned services with which the new code is expected to interact. This lowers the barriers for testing code under development, making it easier to find defects earlier in the development process, and opens the door for agile development of SOA applications.

Parasoft claims to have embodied service virtualization concepts in a product as early as 2003 with SOATest. The term itself was coined in 2007 by iTKO with the release of the LISA Platform. In recent years, major enterprise software vendors including IBM, CA, and HP have developed or acquired service virtualization tools, positioning service virtualization to potentially become a mainstream tool category: IBM recently acquired Green Hat, CA acquired iTKO, and HP has developed its own tool in-house.

Lanowitz said, “The market is ready to hear what service virtualization can do to help manage cost. It is being deployed in financial services, retail, telecommunications and other organizations with a high degree of composite apps. It is useful where the test organization may not have access to a mainframe.”

Stubs, mocks and service virtualization

Developers have traditionally created stubs and mocks to represent the services that new code would interact with. This approach took developers' attention away from writing code, reducing their useful output. Peter Cole, former CEO of Green Hat and now IBM Software Group's Director of Quality Management, Rational Software, said, “When developers test against the mock, they code in their assumption of how the service is expected to behave, rather than how it really is. We are finding that the nastiest integration problems are caused by bad assumptions.”
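A minimal sketch of the traditional mocking approach Cole describes, assuming Python's standard unittest.mock and hypothetical service and method names; note how the canned response bakes the developer's assumptions about the real service's behavior directly into the test:

    # Hand-rolled mock of a downstream billing service (hypothetical names).
    from unittest.mock import Mock

    def charge_customer(billing_service, customer_id, amount):
        """Code under test: calls an external billing service."""
        result = billing_service.charge(customer_id, amount)
        return result["status"] == "APPROVED"

    # The mock assumes the service always approves and always returns this
    # exact payload shape; the live service may behave quite differently.
    billing_mock = Mock()
    billing_mock.charge.return_value = {"status": "APPROVED", "balance": 0}

    assert charge_customer(billing_mock, "cust-42", 19.99) is True

The test passes against the mock, but any assumption encoded in that canned payload goes unverified until the code meets the real service.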

Middleware integration is an increasingly complex endeavor. When the new code is finally ready for testing on the live platform, the developer or tester has to manage the other services that interact with this code. In many cases these services run on platforms with which the developer is not familiar. “There are developers that spend 30 to 50% of the time getting the test ready,” said Cole. On the other hand, he continued, “virtualized services are easier to implement and manage. A developer can build several virtualized services in less than 15 minutes.”

Service virtualization models the behavior of services that code might interact with. This is far simpler than modeling all of the logic of the underlying code, said John Michelsen, iTKO founder and now a distinguished engineer at CA Technologies. “You have to build a piece of software that so matches the behavior of another piece of software that no one knows the difference between the simulation and the real thing. It has to offer the same interface, but does not need to support all of the underlying logic.

"A customer may have a billion lines of COBOL, which they do not understand,” he said. With simulated or virtualized services, they don’t need to understand those billion lines of code, he continued, they just need to know how it interacts with other applications.

Service virtualization tools can remove several constraints in the development process around the availability of test hardware and services, as well as data privacy, said Wayne Ariola, vice president of strategy and corporate development at Parasoft. A recent Parasoft survey found that organizations have an average of 8 to 10 dependencies associated with an application under test. At any point in time, a developer may have access to only 30% of those dependencies.

With service virtualization, development teams can also test against the specified behavior of new services, rather than waiting for the code to be developed. This reinforces the trend toward more Agile, highly iterative development techniques. Service virtualization tools also allow organizations to model and mask sensitive data, making it possible to test against services subject to HIPAA, SOX, and other privacy constraints.
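One way to picture the data masking described above, sketched with hypothetical field names; regulated values are replaced with stable surrogates before a recorded response is ever replayed by a virtual service:

    # Sketch of masking sensitive fields in a recorded service response
    # before it is replayed by a virtual service (hypothetical field names).
    import hashlib

    SENSITIVE_FIELDS = {"ssn", "dob", "card_number"}

    def mask_record(record):
        """Replace regulated values with stable, non-reversible surrogates."""
        masked = {}
        for key, value in record.items():
            if key in SENSITIVE_FIELDS:
                digest = hashlib.sha256(str(value).encode()).hexdigest()
                masked[key] = "MASKED-" + digest[:8]
            else:
                masked[key] = value
        return masked

    patient = {"name": "J. Doe", "ssn": "123-45-6789", "plan": "HMO"}
    print(mask_record(patient))  # the ssn field becomes an opaque surrogate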

The move to cloud computing may further accelerate use of service virtualization, said Kelly Emo, director, Applications Product Marketing, HP. “For example, a credit service or an inventory service could already be stressed when new business requirements come in for a part of the composite application to move out to the cloud,” she said. “From a development team perspective, these are very rapid fire changes on existing shared services.”

A worse scenario is waiting. In fact, development teams often get stuck in a wait state, said Emo. “They need to do end-to-end testing, but the service may not be available yet,” she said. These delays cause teams to miss schedules or do less-than-optimal testing. HP's Service Virtualization product seeks to address such bottlenecks.

Bank of America was an early adopter of service virtualization and has used the technology to increase test coverage and improve code quality. Burt Klein, former Performance and Resiliency Testing Executive at BOA, said the organization had been considering a very high-cost testing platform for two applications using traditional testing approaches. The bank was able to deploy a new service virtualization testing infrastructure from iTKO for less money, and that infrastructure has since grown to support more concurrent releases.

Klein said the platform allowed much better test coverage for performance, negative, and operational testing. Testers could simulate the performance of individual services that new code would interact with in order to identify different breaking points and the behavior associated with them. This knowledge helped the operations team identify the underlying choke points when failures occurred.
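A rough sketch of how such performance and negative-testing scenarios are commonly expressed against a virtual dependency; the latency and failure settings below are hypothetical, not drawn from the iTKO tooling itself:

    # A virtual dependency configured to simulate slow or failing behavior,
    # used to probe the breaking points of the code that depends on it.
    import random
    import time

    class VirtualDependency:
        def __init__(self, latency_seconds=0.0, failure_rate=0.0):
            self.latency_seconds = latency_seconds
            self.failure_rate = failure_rate

        def call(self, request):
            time.sleep(self.latency_seconds)         # injected latency
            if random.random() < self.failure_rate:  # injected failures
                raise TimeoutError("simulated dependency failure")
            return {"status": "OK", "echo": request}

    # Degrade the simulated service to find the application's choke points.
    credit_service = VirtualDependency(latency_seconds=2.5, failure_rate=0.2)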

The tool infrastructure also dramatically improved the ability to find and resolve defects faster, said Klein. In this case, a defect was considered an outright failure or a missed service level agreement for an application. After service virtualization was first deployed, the defect rate dropped from 18,000 failures per million opportunities to 250 failures. Further refinement eventually reduced the defect rate to 0.001 failures per million opportunities.
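For reference, the figures Klein cites follow the standard defects-per-million-opportunities (DPMO) arithmetic. The counts below are purely illustrative and chosen only to reproduce the cited rates:

    # Defects per million opportunities (DPMO), the measure behind Klein's figures.
    def dpmo(defects, opportunities):
        return defects / opportunities * 1_000_000

    print(dpmo(18, 1_000))          # 18000.0  (rate before service virtualization)
    print(dpmo(1, 4_000))           # 250.0    (rate after initial deployment)
    print(dpmo(1, 1_000_000_000))   # 0.001    (rate after further refinement)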

Klein was so impressed by the potential for service virtualization that he left the bank to form a Service Virtualization community to help build awareness for the technology and as a forum for sharing best practices. He explained, “I see this as the game changing event in software development in my lifetime.”
