The infamous Y2K bug showed the way. In the late 1990s, the hunt was on for date fields lurking in applications that might not transition properly to the new century. To keep the hunt from devolving into the world's most labor-intensive search for needles in haystacks, innovators and entrepreneurs fielded an array of tools, and some of them are still around.
Today, for efforts to modernize legacy systems through services, the challenge is similar to the Y2K problem: You can't fix what you can't identify. And the same tools, or tools similar to those marshaled in the 1990s, are proving useful for verifying SOX compliance, for implementing "mass change" (such as expanding ID numbers) and for preparing to update legacy applications.
Phil Murphy, an analyst at Forrester, categorizes most of the relevant applications as application mining tools. "Basically, they parse source code and can tell you whether there are data dependencies and processing calls."
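At its simplest, the kind of analysis Murphy describes amounts to scanning source for processing calls and for the data items each routine touches. The sketch below is a deliberately minimal illustration of that idea in Python; the COBOL fragment, field names and module name (`TAXCALC`) are all hypothetical, and a real application mining tool does far deeper parsing than these regular expressions.

```python
import re

# A hypothetical COBOL fragment to mine. Real tools parse the full
# language; this sketch only pattern-matches a few common statements.
COBOL_SOURCE = """
       PROCEDURE DIVISION.
       CALC-TOTALS.
           MOVE CUST-ID TO WS-KEY.
           CALL 'TAXCALC' USING WS-KEY WS-AMOUNT.
           ADD WS-AMOUNT TO GRAND-TOTAL.
"""

def mine(source: str) -> dict:
    # Processing calls: modules invoked via CALL 'NAME'
    calls = re.findall(r"CALL\s+'([A-Z0-9-]+)'", source)
    # Crude data-dependency hints: items moved from and moved to
    reads = re.findall(r"MOVE\s+([A-Z0-9-]+)\s+TO", source)
    writes = re.findall(r"TO\s+([A-Z0-9-]+)", source)
    return {"calls": calls, "reads": reads, "writes": writes}

report = mine(COBOL_SOURCE)
print(report)
```

Even this toy version surfaces the two things Murphy mentions: which modules a program calls, and which data items flow between them.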
Still, Murphy says it is important to understand what tools can and cannot do. “In the late 1980s and 1990s when components were the future, vendors sold these tools as a means to take your old COBOL and pull out the branching logic, which they defined as business rules, and then throw out the shell. Businesses that relied on that promise hurt themselves,” he says.
But application mining tools can be a very powerful starting point, providing rapid and extensive information about legacy applications that would otherwise be difficult or impossible to get. That makes them practically indispensable for those seeking to modernize older apps.
A second category of tool, less sophisticated than application mining but still potentially useful for legacy updates, is web-to-host applications, which Murphy notes used to be called, pejoratively, "screen scrapers." These tools don't assume you will change your source code but merely provide a new interface for the code you've got. "If you roll the clock back to the first e-business boom, brick and mortar companies – with their legacy CICS 'green screen' interfaces – were getting slaughtered and they wanted a way to get to the internet fast," says Murphy.
One such solution evolved rapidly to produce an effective way to translate the highly formatted 3270 data stream into something that could be fed to a Web interface. According to Murphy, “The first ones were very brittle and didn’t work too well but they got better and the ones that are still around are quite powerful.”
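The reason the 3270 stream lends itself to this treatment is that a terminal screen is a fixed 24x80 character grid, so fields sit at known row and column positions. A minimal sketch of the scraping idea, with an entirely hypothetical screen layout and field map:

```python
# A 3270 "green screen" is a fixed grid of characters, so the simplest
# scraper just slices fields out of known positions. The screen content
# and field coordinates below are invented for illustration.

SCREEN = [
    "ORDER INQUIRY                        PAGE 01".ljust(80),
    "ORDER NO: 000123   STATUS: SHIPPED          ".ljust(80),
    "CUSTOMER: ACME CORP                         ".ljust(80),
]

# Field map: name -> (row, start_col, end_col)
FIELDS = {
    "order_no": (1, 10, 16),
    "status":   (1, 27, 34),
    "customer": (2, 10, 44),
}

def scrape(screen, fields):
    # Slice each field out of the grid and trim padding blanks.
    return {name: screen[row][start:end].strip()
            for name, (row, start, end) in fields.items()}

print(scrape(SCREEN, FIELDS))
```

The brittleness Murphy describes is visible here: move a field by one column on the host screen and the scraper silently returns garbage, which is why later products added more robust field detection.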
Murphy says the limits of the technology are primarily the limits of the underlying application. For instance, he notes, if an order entry system was designed to serve 1,000 internal users, each with their own mainframe ID, it can be hard to expose that capability to a public that could number in the millions. One route is "session pooling," which can work, "but security administrators don't like that," he says.
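Session pooling works by sharing a fixed set of pre-authenticated host sessions among many web users, so millions of visitors ride on a handful of mainframe IDs. A minimal sketch of the pattern, with hypothetical session names standing in for real host connections:

```python
import queue

class SessionPool:
    """Share a fixed set of pre-opened host sessions among many callers."""

    def __init__(self, session_ids):
        self._pool = queue.Queue()
        for sid in session_ids:
            self._pool.put(sid)

    def execute(self, request):
        sid = self._pool.get()      # block until a session is free
        try:
            # A real implementation would drive the host session here.
            return f"{request} handled by {sid}"
        finally:
            self._pool.put(sid)     # always return the session to the pool

pool = SessionPool(["MAINFRAME-01", "MAINFRAME-02"])
print(pool.execute("order-lookup"))
```

The security objection Murphy cites is also visible in the sketch: every web user's work is performed under one of a few shared IDs, so the host's own audit trail no longer identifies the actual end user.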
Screen scrapers are available from companies like Red Oak Software and Rocket Seagull, which also provide other tools for application update and integration projects. "If you need a quick and dirty solution, the web-to-host tool is good, but if you need something more sophisticated, definitely consider an application mining tool," Murphy adds.
This was first published in February 2011