It really doesn’t seem so long ago, but the IT world was quite different back when I was studying computer science at Cambridge in the mid-1980s. In those days we didn’t have tablets or laptops let alone mobile phones. You might say the coolest technology around was a boom box that played cassette tapes. We only had one BBC microcomputer in our esteemed college connected to the university’s central IBM 370 mainframe and it was kept tightly under lock and key. Every time I needed access to make updates to my applications, I had to (politely) ask the college porter for the key.
There were only a few of us studying computer science under our head of department, Roger Needham, and one of our more memorable projects was to write a compiler in IBM 370 assembler. To be honest, I never imagined that 25 years later we’d still be tackling problems with programs that were first written back in the heyday of Ronald Reagan and Maggie Thatcher.
Organizations that have had mainframes for a long time generally have applications that their business depends on. These applications, originally written on what was then state-of-the-art hardware, have been modified and ‘improved’ over the years by different developers, and the result is that they can appear dauntingly large. The thought of trying to modernize these behemoths and move them to a more flexible x86 platform can be overwhelming.
The good news is that these problems are not insurmountable. Through detailed, automated analysis of mainframe applications, we’re finding that there are often large sections of duplicate or even dead code in these ancient apps. This means there is a lot less ‘real’ code that needs to be translated or rewritten.
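The kind of automated analysis described above often boils down to reachability: any routine that can never be reached from the application's entry points is dead code. A minimal sketch of that idea, using a hypothetical call graph with illustrative routine names (real mainframe analysis tools work on parsed COBOL or assembler sources, not a hand-built dictionary):

```python
# Hypothetical, simplified call graph extracted from a legacy application:
# each routine maps to the routines it calls. All names are illustrative.
call_graph = {
    "MAIN": ["READ_INPUT", "PROCESS"],
    "READ_INPUT": [],
    "PROCESS": ["WRITE_REPORT"],
    "WRITE_REPORT": [],
    "OLD_REPORT_WRITER": ["FORMAT_PAGE"],  # never called from MAIN
    "FORMAT_PAGE": [],
}

def find_dead_code(graph, entry_points):
    """Return routines unreachable from any entry point (i.e. dead code)."""
    reachable = set()
    stack = list(entry_points)
    while stack:
        routine = stack.pop()
        if routine in reachable:
            continue
        reachable.add(routine)
        stack.extend(graph.get(routine, []))
    return sorted(set(graph) - reachable)

print(find_dead_code(call_graph, ["MAIN"]))
# → ['FORMAT_PAGE', 'OLD_REPORT_WRITER']
```

Everything the traversal never touches is a candidate for removal before any translation or rewriting effort begins, which is exactly why the ‘real’ code base is often far smaller than it first appears.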
In fact, there are often routines that were custom coded when the original application was written but that should, today, be replaced by modern off-the-shelf packages. For example, when an application was written in the 1970s, the developers may have created a custom report writer within the application because nothing suitable was available at the time. However, that section of the program could now be capably replaced by any one of the many off-the-shelf report writers available today. This type of replacement can reduce the size of old applications even further.
To explore these growing opportunities for improving and modernizing legacy mainframe applications, you need something like an X-ray (see image above) to reveal a clear picture of an application’s complexity, as well as the amount of duplicate or dead code hidden inside. Remember: finding out how large the mainframe application modernization problem really is before you start down the path is an all-important first step.
To gain more insight on this topic, I strongly encourage you to watch and listen to a recording of the recent EMC Global Services Mainframe Transformation webinar presented by Steve Woods. It’s a great resource to understand what goes into planning and executing an application modernization initiative, what factors will affect cost, and how to begin.