Issues and Challenges Facing Legacy Systems

Maintaining and upgrading legacy systems is one of the most difficult challenges CIOs face today. Constant technological change steadily erodes the business value of legacy systems, which have often been developed over many years at enormous expense. CIOs struggle with the problem of modernizing these systems while keeping their functionality intact. Despite their obsolescence, legacy systems continue to provide a competitive advantage by supporting unique business processes and holding invaluable knowledge and historical data.

Despite the availability of more cost-effective technology, about 80% of IT systems still run on legacy platforms. International Data Corp. estimates that 200 billion lines of legacy code are still in use today at more than 10,000 large mainframe sites. The difficulty of accessing legacy applications is reflected in a December 2001 Hurwitz Group study, which found that only 10% of enterprises have fully integrated their most mission-critical business processes.

Driving the need for change is the cost versus the business value of legacy systems, which according to some industry surveys consume as much as 85-90% of an IT budget in operation and maintenance. Monolithic legacy architectures are the antithesis of modern distributed and layered architectures. Legacy systems execute business policies and decisions that are hardwired into rigid, predefined process flows, making integration with customer relationship management (CRM) software and Internet-based business applications painful and sometimes impossible. In addition, IT departments find it increasingly hard to hire developers qualified to work on applications written in languages no longer found in modern development.

Numerous options exist for modernizing legacy systems, defined here as any monolithic information system that is too difficult and expensive to modify to meet new and constantly changing business requirements. Techniques range from quick fixes such as screen scraping and legacy wrapping to longer-term, but more complex, solutions such as automated migration or replacing the system with a packaged product.

A Short History

Debate over legacy modernization can be traced back more than a decade, to when reengineering specialists argued over whether it was best to migrate a large, mission-critical information system piece by piece or all at once.

Rewriting a legacy system from scratch can produce a functionally equivalent information system based on modern software techniques and hardware. But the high risk of failure associated with any large software project reduces the chances of success. Researchers on the pioneering 1991 DARWIN project at the University of California, Berkeley, listed several factors working against the so-called “Cold Turkey” approach:

Management rarely approves a major expenditure if the only result is lower maintenance costs rather than additional business functionality.
Development of such enormous systems takes years, so unanticipated business processes will have to be added to keep pace with the changing business climate, increasing the risk of failure.
Documentation for the old system is often inadequate.
Like most large projects, the development process will take longer than planned, testing management’s patience.
And finally, there is a tendency for large projects to end up costing far more than expected.
DARWIN promoted the incremental strategy, widely known as “Chicken Little,” because it divided a large project into manageable pieces. An organization could concentrate on reaching specific milestones over the course of a long-term project, and management could see progress as each piece was deployed on the target system. Industry experts challenged this model several years later, arguing that the need for the legacy and target systems to interoperate through data gateways during the migration added complexity to an already complicated process. In addition, the gateways themselves were a considerable technical challenge.

Many migration projects failed because of the absence of mature automated migration tools to ease the complexity and technical challenges. That began to change in the mid-1990s with the availability of tools from companies such as Anubex, ArtinSoft, FreeSoft, and Relativity Technologies. These tools not only convert legacy code into modern languages but, in doing so, also provide access to a variety of commercially available components that offer advanced functionality and reduce development costs. They help break up a legacy system’s business knowledge into components accessible through modern, industry-standard protocols, a component being a collection of objects that perform specific business services and have clearly defined application programming interfaces (APIs).

Selecting A Modernization Method

The Internet is often the driving force behind legacy modernization today. The Web can save an organization time and money by delivering to customers and partners the business processes and information locked inside a legacy system. The approach used to access back-office functionality will depend on how much of the system has to be Internet-enabled.

Screen scrapers, often called “frontware,” are an option when the intent is to deliver Web access on the current legacy platform. These non-intrusive tools add a graphical user interface to character-based mainframe and minicomputer applications. Screen scrapers run on the desktop computer, which acts as a terminal to the mainframe or minicomputer via 3270 or 5250 emulation. Popular screen scrapers include Star:Flashpoint, Mozart, and ESL. This technique provides Internet access to legacy applications without making any changes to the underlying platform. Because they are non-intrusive, screen scrapers can be deployed in days and sometimes hours. However, scalability can be a concern, because many legacy systems cannot handle nearly as many users as modern Internet-based platforms.
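
To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of wrapper a screen scraper effectively generates. TerminalSession is a hypothetical stand-in for a 3270/5250 emulation library, and the row/column positions are assumptions about one inquiry screen, not taken from any real application.

```java
// TerminalSession is a hypothetical stand-in for a 3270/5250 emulation library;
// the row/column positions are assumptions about a particular inquiry screen.
interface TerminalSession {
    void connect(String host, int port);
    void putString(int row, int col, String text);   // type into a screen field
    void sendEnter();                                 // submit the current screen
    String getString(int row, int col, int length);   // read characters off the screen
    void disconnect();
}

/** Presents one character-based inquiry screen as a simple method call. */
public class AccountInquiryScraper {
    private final TerminalSession session;

    public AccountInquiryScraper(TerminalSession session) {
        this.session = session;
    }

    public String lookUpBalance(String accountNumber) {
        session.putString(5, 20, accountNumber);      // account-number field (assumed position)
        session.sendEnter();
        return session.getString(10, 30, 12).trim();  // balance field (assumed position and width)
    }
}
```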

Legacy wrapping is a second non-intrusive technique. It builds callable APIs around legacy transactions, providing an integration point with other systems. Wrapping does not offer a way to fundamentally change the hardwired structure of the legacy system, but it is commonly used as an integration technique with Enterprise Application Integration (EAI) frameworks offered by companies such as SeeBeyond Technology, Tibco, Vitria, and WebMethods.
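
A minimal sketch of what wrapping produces, assuming a hypothetical LegacyTransactionGateway connector and an invented “CUSTINQ” transaction that exchanges fixed-format records; the point is the callable API around the legacy transaction, not the specifics of any vendor’s adapter.

```java
// LegacyTransactionGateway is an assumed stand-in for whatever connector actually
// reaches the host (a vendor adapter, a TP-monitor gateway); it is not a real API.
interface LegacyTransactionGateway {
    byte[] invoke(String transactionId, byte[] record); // fixed-format record in, record out
}

/** Wraps the (invented) CUSTINQ inquiry transaction behind an ordinary Java method. */
public class CustomerService {
    private final LegacyTransactionGateway gateway;

    public CustomerService(LegacyTransactionGateway gateway) {
        this.gateway = gateway;
    }

    public String getCustomerName(String customerId) {
        byte[] request = String.format("%-10s", customerId).getBytes(); // pad to the assumed field width
        byte[] response = gateway.invoke("CUSTINQ", request);
        int nameLength = Math.min(30, response.length); // assumed 30-byte name field at offset 0
        return new String(response, 0, nameLength).trim();
    }
}
```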

EAI moves away from rigid application-to-application connectivity toward more loosely coupled message- or event-based approaches. The middleware also provides data translation and transformation, rules- and content-based routing, and connectors (often called adapters) to packaged applications. Vendors typically offer one of three system-wide integration architectures: hub-and-spoke, publish-and-subscribe, or business process automation. XML-based EAI tools are considered the state of the art in loosely coupled modern architectures.
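
As a sketch of the publish-and-subscribe style, the following uses the standard JMS API to publish a business event as an XML message on a topic. How the ConnectionFactory and Topic are obtained (JNDI lookup, broker configuration) is product-specific and omitted, and the event format is an assumption.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;

/** Publishes an "order created" event as an XML text message on a JMS topic. */
public class OrderEventPublisher {
    private final ConnectionFactory factory;
    private final Topic orderTopic;

    public OrderEventPublisher(ConnectionFactory factory, Topic orderTopic) {
        this.factory = factory;  // supplied by the EAI/message broker environment
        this.orderTopic = orderTopic;
    }

    public void publishOrderCreated(String orderId, String customerId) throws Exception {
        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(orderTopic);
            TextMessage message = session.createTextMessage(
                    "<orderCreated><orderId>" + orderId + "</orderId>"
                    + "<customerId>" + customerId + "</customerId></orderCreated>");
            producer.send(message); // subscribers receive the event without point-to-point coupling
        } finally {
            connection.close();
        }
    }
}
```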

EAI vendors promote wrapping as a way to tap legacy data while avoiding the pain of trying to modify the underlying platform. This strategy also lets integration vendors focus on the communications and connectivity aspects of their solutions while steering clear of the complexity of legacy systems. Like screen scraping, wrapping techniques are applicable in situations where there is no need to change the business functionality of the existing platform. However, none of the above techniques addresses the high cost of maintaining a legacy system or of finding IT professionals willing to work on obsolete technology.

Another option is replacing an older information system with modern, packaged software and hardware from any of a range of ERP vendors, including Lawson Software, Manugistics, PeopleSoft, Oracle, and SAP. This approach makes sense when the code quality of the original system is so poor that it cannot be reused. However, deploying a modern ERP system is not a panacea. An organization either has to customize the software or conform to its business processes. The first option is necessary if the original system was custom-built and provided a critical business advantage. Over the last several years, major ERP vendors have added tools to help adapt their applications to a customer’s specific needs. Even so, customization still carries a substantial risk that the system will not be able to replicate a unique set of business processes.

In addition, a packaged system requires retraining of end users, whose productivity will slow as they adjust to a new way of doing their jobs. IT staff will also need training on the new system. Finally, ERP applications carry substantial licensing fees that persist throughout the life of the software.

When Legacy Migration Makes Sense

Legacy migration is best suited for companies looking to implement a new business model, such as an Internet-based procurement or other B2B system, on either of the two major platforms: J2EE from Sun Microsystems and its partners, or Microsoft’s .NET. Both emerging development/deployment environments support XML and SOAP, standards used in exporting and consuming Web services across heterogeneous platforms. Another justification for starting a complex migration project is the increasing cost and difficulty of maintaining and modifying the old system.
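
The sketch below shows the Java side of that idea: a migrated procurement function exposed as a SOAP web service using JAX-WS annotations. The service name, operation, and endpoint URL are illustrative assumptions rather than part of any particular migration.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Illustrative only: exposes a migrated procurement function as a SOAP web service.
// The service name, operation, and returned value are assumptions.
@WebService
public class ProcurementService {

    @WebMethod
    public String createPurchaseOrder(String supplierId, String itemCode, int quantity) {
        // In a real migration this would delegate to the componentized business logic.
        return "PO-" + supplierId + "-" + itemCode + "-" + quantity;
    }

    public static void main(String[] args) {
        // Publishes the service using the JDK's built-in lightweight HTTP endpoint.
        Endpoint.publish("http://localhost:8080/procurement", new ProcurementService());
    }
}
```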

The first step in the migration process is the analysis and assessment of the legacy system. Typically, this involves taking inventory of all application artifacts, such as source code, copybooks, and Job Control Language (JCL). A complete database analysis is also needed, covering tables, views, indexes, stored procedures and triggers, and data profiling.
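
A rough illustration of the inventory step, assuming the source repository is available on a file system and that artifacts can be recognized by extension; real assessment tools catalog far more, including screens, database objects, and the cross-references between them.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Stream;

/** Walks a source tree and counts application artifacts by type (assumed extensions). */
public class ArtifactInventory {
    public static void main(String[] args) throws IOException {
        Map<String, Integer> counts = new TreeMap<>();
        try (Stream<Path> files = Files.walk(Paths.get(args[0]))) {
            files.filter(Files::isRegularFile).forEach(path -> {
                String name = path.getFileName().toString().toLowerCase();
                String type = name.endsWith(".cbl") ? "COBOL program"
                        : name.endsWith(".cpy") ? "copybook"
                        : name.endsWith(".jcl") ? "JCL job"
                        : "other";
                counts.merge(type, 1, Integer::sum);
            });
        }
        counts.forEach((type, count) -> System.out.println(type + ": " + count));
    }
}
```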

Database vendors such as Oracle and IBM supply tools that help automate the database migration, which is separate from the application migration. All source database schema objects must be mapped to the target database. Depending on the complexity of the system, from 80% to 90% of the migration process can be automated. However, there will always be problems with stored procedures and triggers that an automated parser cannot decipher, requiring manual tweaking.
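
A simple post-migration sanity check along those lines, sketched with plain JDBC: compare row counts for each migrated table in the source and target databases. The connection URLs and table names are assumptions; a real validation would also compare checksums and sampled rows.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.List;

/** Compares row counts between a source and a target database after migration. */
public class MigrationRowCountCheck {
    public static void main(String[] args) throws Exception {
        List<String> tables = List.of("CUSTOMER", "DONATION", "PLEDGE"); // assumed table names
        try (Connection source = DriverManager.getConnection(args[0]);   // legacy database URL
             Connection target = DriverManager.getConnection(args[1])) { // migrated database URL
            for (String table : tables) {
                long sourceCount = countRows(source, table);
                long targetCount = countRows(target, table);
                System.out.printf("%s: source=%d target=%d %s%n", table, sourceCount, targetCount,
                        sourceCount == targetCount ? "OK" : "MISMATCH");
            }
        }
    }

    private static long countRows(Connection connection, String table) throws Exception {
        try (Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT COUNT(*) FROM " + table)) {
            resultSet.next();
            return resultSet.getLong(1);
        }
    }
}
```
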
Database migration can add substantial time to a project. For example, Mercy Ships, a Christian charity headquartered near Tyler, Texas, moved its 4GL Informix application on a SCO Unix server to Informix’s Java-based Cloudscape database running on Linux. The project was intended to lower the maintenance costs of the system used to track donors and donations. In addition, the new system gave Mercy Ships a modern development platform for customizing and adding services.

Using an automated migration tool, Mercy Ships ported its 80,000-line application, called Collaboration, in less than a month. But the total project, including setting up seven sites in Europe and the U.S. with databases and writing Java servlets for maintenance and replication, took seven months. If everything had stayed on the same database, the project would have been finished in about a month.

Legacy Application Migration

In the early stages of the migration process, core business logic must be identified and mapped to reveal the interrelationships of the code performing the application’s business functions. Program-affinity analysis can be performed to produce call maps and process flow diagrams, which include program-to-program call/link relationships. These maps and diagrams make it possible to visually identify linked clusters of programs, which are good indicators of related business activity. Companies offering tools that assist in the analysis and assessment process include MigraTEC, Netron, Semantic Designs, and McCabe and Associates.
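
A rough sketch of the affinity idea: scan COBOL sources for static CALL statements and build a program-to-program call map. Commercial tools also resolve dynamic calls, CICS links, and JCL job steps; this handles only the simple literal form, and the file extension is an assumption.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Stream;

/** Builds a caller-to-callee map from static CALL 'PROGNAME' statements in COBOL sources. */
public class CallMapBuilder {
    private static final Pattern CALL =
            Pattern.compile("CALL\\s+'([A-Z0-9-]+)'", Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws IOException {
        Map<String, Set<String>> callMap = new TreeMap<>();
        try (Stream<Path> sources = Files.walk(Paths.get(args[0]))) {
            sources.filter(p -> p.toString().toLowerCase().endsWith(".cbl")) // assumed extension
                   .forEach(p -> addCalls(p, callMap));
        }
        // Print each program with the set of programs it calls.
        callMap.forEach((caller, callees) -> System.out.println(caller + " -> " + callees));
    }

    private static void addCalls(Path source, Map<String, Set<String>> callMap) {
        String file = source.getFileName().toString();
        String caller = file.substring(0, file.lastIndexOf('.')).toUpperCase();
        try {
            Matcher m = CALL.matcher(Files.readString(source));
            while (m.find()) {
                callMap.computeIfAbsent(caller, k -> new TreeSet<>()).add(m.group(1).toUpperCase());
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```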

Once core business logic is identified and mapped, it can be broken up into standalone components deployable in client/server and Internet-based environments. This process creates collections of programs that perform a specific business function. In addition, the components have clearly defined APIs and can be accessed with modern, industry-standard protocols. Components can remain on the mainframe as COBOL, PL/I, or Natural programs, or be redeployed into modern, distributed environments such as Java 2 or .NET.

As part of the transformation process, a special class of components that exists in every system needs to be identified. These components perform common system utility functions, such as error reporting, transaction logging, and date-calculation routines, and typically operate at a lower level of abstraction than business components. To avoid processing redundancy and to ensure consistency in system behavior, these components should be standardized into a system-wide reusable utility library.
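
A minimal sketch of such a shared utility library, assuming an eight-digit legacy date format and a single error-reporting entry point; the class name, method names, and formats are illustrative, not part of any migration tool.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.temporal.ChronoUnit;

/** System-wide utility routines consolidated so individual programs stop duplicating them. */
public final class SystemUtilities {
    private static final DateTimeFormatter LEGACY_DATE = DateTimeFormatter.ofPattern("yyyyMMdd");

    private SystemUtilities() {}

    /** Parses a legacy eight-digit date field (yyyyMMdd) into a modern date value. */
    public static LocalDate parseLegacyDate(String yyyymmdd) {
        return LocalDate.parse(yyyymmdd, LEGACY_DATE);
    }

    /** Whole days between two legacy date fields, a typical date-calculation routine. */
    public static long daysBetween(String from, String to) {
        return ChronoUnit.DAYS.between(parseLegacyDate(from), parseLegacyDate(to));
    }

    /** Single error-reporting entry point so every component logs failures consistently. */
    public static void reportError(String component, String message, Throwable cause) {
        System.err.printf("[%s] %s: %s%n", component, message, cause == null ? "" : cause);
    }
}
```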

When choosing a migration tool, organizations should consider the quality of the generated code. Tools that map every element of the legacy language to an equivalent in the target language can be significant time savers. Developers who are experts in the legacy language will find the generated code easier to understand if it preserves recognizable representations of the legacy code’s language and structure.

In addition, companies may find it more convenient to separate the conversion process into two steps. The first is the translation of existing code, data migration, and associated testing; the second is the addition of new functionality. Before making any changes to program logic and structure, companies should first test the result of the migration process for functional equivalence with the original legacy application.
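
A sketch of that equivalence check, with a hypothetical PremiumCalculator interface standing in for whichever business function is being migrated: the same sample records are run through the legacy wrapper and the migrated component, and any difference is flagged before new features are layered on.

```java
import java.util.List;

/** Compares the output of a legacy function (via a wrapper) with its migrated replacement. */
public class EquivalenceCheck {

    // Hypothetical abstraction over both the legacy and the migrated implementation.
    interface PremiumCalculator {
        String calculate(String policyRecord);
    }

    /** Returns true only if both implementations agree on every sample record. */
    static boolean outputsMatch(PremiumCalculator legacy, PremiumCalculator migrated, List<String> records) {
        boolean allMatch = true;
        for (String record : records) {
            String expected = legacy.calculate(record);
            String actual = migrated.calculate(record);
            if (!expected.equals(actual)) {
                System.out.printf("MISMATCH for %s: legacy=%s, migrated=%s%n", record, expected, actual);
                allMatch = false;
            }
        }
        return allMatch;
    }
}
```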

 

Conclusion

Despite the challenges, legacy modernization is crucial for organizations spending too much to preserve the business value of their outdated information systems. Also driving the need for change is the industry’s move toward new Internet-based platforms, such as .NET and J2EE. These new computing paradigms take advantage of a component-based, distributed computing model capable of automating business processes within an organization or with partners via Web services. Adopting newer computing platforms can cut operating costs and make it easier to adapt an information system to market changes or competitive pressure.

A variety of options exist, including replacing the system with a packaged application or taking non-intrusive measures such as screen scraping and code wrapping. Each approach makes sense under particular circumstances. The latter techniques provide fast and affordable access to legacy functionality, while the former can eliminate legacy applications whose code quality is too poor to migrate. But for companies planning to preserve and extend the functionality of an older system on a modern platform, legacy transformation using today’s migration tools may be the most cost-effective approach.
Credits To: http://www.developer.com/
