Treehouse Software Eases the Challenges for Mainframe-to-Modern Platform Transformation Programs for Systems Integrators

Unlock Your Customers’ Business Value Faster with Frequent Production Deliveries, Using Agile Methods

Treehouse Software’s product, tcVISION, has proven vital to the ease and success of many mainframe modernization initiatives.

A big problem for a Systems Integrator (SI), or any mainframe modernization vendor, is the myriad of dependencies between mainframe and open systems data and components. As part of a mainframe modernization project, these vendors need to determine the order of the phases, or waves, of applications to be migrated. Shared databases often make it very difficult to move one set of applications without moving others. This can lead to the conclusion that a “big bang” approach is required, and therefore to a bid for a program with one large delivery at the end, once all of the integrated components are in place. That is very unappealing to customers, who are paying out big dollars without the corresponding business value of a production deployment, and it also significantly increases project risk.

[Diagram: Mainframe to LUW replication]

With Treehouse Software’s bi-directional replication product, tcVISION, a subset of applications can be migrated, and tcVISION’s data synchronization capabilities can be used to bridge specific shared legacy mainframe databases to their modern counterparts on open systems. This allows the SI to deliver in phases, getting modernized modules into production faster. It also decreases risk, because issues are found much sooner with more frequent production deployments. Once the last modules are deployed to production, the synchronization is no longer needed.
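
To make the bridging idea concrete, here is a minimal, purely illustrative sketch of the general change-replication pattern: change events captured on a legacy source are applied to a relational target so the two stay in step. This is not tcVISION’s interface; the event format, table, and columns are hypothetical, and an in-memory SQLite database stands in for the open-systems target.

```python
# Purely illustrative: apply captured change events to an open-systems target.
# This is NOT tcVISION's API; the event layout, table, and columns are
# hypothetical, and an in-memory SQLite database stands in for the target.
import sqlite3

# Hypothetical change events, as a CDC tool might deliver them.
change_events = [
    {"op": "INSERT", "key": 1001, "name": "ACME LTD", "balance": 2500.00},
    {"op": "UPDATE", "key": 1001, "name": "ACME LTD", "balance": 1900.00},
    {"op": "DELETE", "key": 987},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, balance REAL)")

for ev in change_events:
    if ev["op"] == "INSERT":
        conn.execute("INSERT INTO customer (id, name, balance) VALUES (?, ?, ?)",
                     (ev["key"], ev["name"], ev["balance"]))
    elif ev["op"] == "UPDATE":
        conn.execute("UPDATE customer SET name = ?, balance = ? WHERE id = ?",
                     (ev["name"], ev["balance"], ev["key"]))
    elif ev["op"] == "DELETE":
        conn.execute("DELETE FROM customer WHERE id = ?", (ev["key"],))

conn.commit()
print(conn.execute("SELECT * FROM customer").fetchall())  # [(1001, 'ACME LTD', 1900.0)]
```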

As an example, a large Treehouse customer is currently taking this approach to gradually modernize and replace a set of applications. While the applications are being modernized, the customer’s mainframe Adabas database is kept synchronized with Oracle on Linux. Treehouse Software’s experts and the tcVISION product are ensuring that the customer’s transition goes smoothly and seamlessly.


Enterprise ETL and Real-Time Data Replication Between Virtually Any Source and Target

tcVISION supports a vast array of integration scenarios throughout the enterprise, providing easy and fast data migration for mainframe application modernization projects and enabling bi-directional data replication between mainframe, Linux, Unix, and Windows platforms. The technology identifies and captures changes occurring in mainframe and relational databases, then publishes the required information to a wide variety of targets, both on-premises and cloud-based.

[Diagram: tcVISION connection overview]

tcVISION acquires data in bulk or via change data capture (CDC), including in real time, from virtually any IBM mainframe data source (Software AG Adabas, IBM DB2, IBM VSAM, IBM IMS/DB, CA IDMS, CA Datacom, sequential files), then transforms and delivers it to virtually any target. In addition, the same product can extract and replicate data from a variety of non-mainframe sources, including Adabas LUW, Hadoop/HDFS, MongoDB, Oracle Database, Microsoft SQL Server, IBM DB2 LUW and DB2 BLU, IBM Informix, and PostgreSQL, among others.
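
For a rough feel of what the bulk path involves (independent of any particular tool), the sketch below decodes fixed-width, EBCDIC-encoded records from a sequential-file extract into Python dictionaries ready for loading into a relational target. The 34-byte record layout is invented for the example and is not a real copybook.

```python
# Illustrative only: decode fixed-width EBCDIC (code page 037) records from a
# mainframe sequential-file extract. The 34-byte layout is a hypothetical
# example, not a real copybook.
RECORD_LENGTH = 34
LAYOUT = [("id", 0, 6), ("name", 6, 26), ("amount", 26, 34)]  # (field, start, end)

def decode_record(raw: bytes) -> dict:
    """Convert one EBCDIC record into a dictionary of field values."""
    text = raw.decode("cp037")
    row = {name: text[start:end].strip() for name, start, end in LAYOUT}
    row["amount"] = int(row["amount"]) / 100   # amount held as whole cents in this example
    return row

def read_extract(path: str):
    """Yield decoded rows from a binary extract file, one fixed-width record at a time."""
    with open(path, "rb") as f:
        while chunk := f.read(RECORD_LENGTH):
            if len(chunk) == RECORD_LENGTH:
                yield decode_record(chunk)

# Example with a single in-memory record instead of a file:
sample = "000123ACME SUPPLY CO.     00012599".encode("cp037")
print(decode_record(sample))  # {'id': '000123', 'name': 'ACME SUPPLY CO.', 'amount': 125.99}
```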


Visit the Treehouse Software website for more information on tcVISION, or contact us to discuss your needs.

A Well-Earned (Application) Retirement

Guest blogger Howard Sherrington, CEO of NSC LegacyData Solutions Ltd., developers of DataNovata, discusses how retiring legacy applications can cut IT costs and free up resources – if the right approach is taken to accessing the legacy data.

The one constant in business IT is that yesterday’s new systems will become tomorrow’s legacy.  As organizations evolve and new technologies emerge, IT departments have to deal with the impact of this constant evolution and change on business operations.

But change events are difficult to predict.  Mergers or acquisitions can transform even recent deployments into duplicate or legacy systems, as the IT function struggles to keep up with the changing demands of the business.

Legacy systems represent a drain on IT resources, both in terms of cost and manpower.  Industry analyst firm Gartner conservatively estimates that businesses spend around 10% to 25% of their IT budget on supporting and managing legacy systems – and I believe that this can rise to as much as 35% in some organizations, especially those running old and complex systems and applications.

So, given the current business climate, where capital for new projects is harder to come by and operational expenditure for existing systems is under close scrutiny, it’s prudent to look for more efficient ways of dealing with these older applications than simply keeping them running.

It’s all about the data

In these circumstances, it’s essential to establish why the legacy system is being maintained.  In the vast majority of cases, it’s because of the data held in the system or application.

I frequently hear the statement, “No, we can’t do anything with that service, as we still need it”, even though nine times out of ten it’s simply access to the data that’s required, rather than the application itself.  This also correlates with analyst firm Forrester’s estimate that nearly 85% of data in databases is inactive, simply being stored for subsequent access rather than being processed.

This is often the case for financial systems, especially in pensions and investment management, which usually lie dormant – at a considerable cost, as we saw earlier – so that the business retains access to the legacy data for legal, taxation, due diligence or compliance purposes.

A retirement bonus

So why not retain the vital elements and let go of the redundant parts?  By separating the legacy data that the business needs from the legacy system, and then decommissioning the applications and platforms, a business could make substantial savings in budgets, support and resource commitment.

There’s also the opportunity to increase the efficiency of operations by giving staff wider, more flexible access to the legacy data.  Let’s take a closer look at how application retirement should be approached and managed, and the benefits it can offer your business.

Migration matters

The initial issue in application retirement is the migration of the data from the legacy application or platform.  Exactly how this is done will depend on several factors, including the type and age of the application and platform, and how the data is stored.  However, there are a couple of key ‘best practice’ points which should be observed in any migration project.

First, reduce the risk of data loss or damage by testing your procedures.  Make sure you have a backup copy of the data before trying a migration, and if possible, pilot the process using a small subset of the data.  Then you can compare the extracted data with the original to ensure the process isn’t changing the data in any way.
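
As a minimal sketch of that comparison step, the example below checks a pilot extract (a CSV file) against the migrated staging table by row count and per-row checksum. The file name, table name, and key column are hypothetical, SQLite stands in for the staging database, and values are compared as trimmed strings, so formatting differences would need normalizing in practice.

```python
# Minimal sketch: verify a pilot migration by comparing a CSV extract of the
# source data against the migrated staging table. File, table, and key column
# names are hypothetical; SQLite stands in for the staging database.
import csv
import hashlib
import sqlite3

def fingerprint(values):
    """Stable checksum over a row's values, compared as trimmed strings."""
    joined = "|".join(str(v).strip() for v in values)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def compare(extract_csv: str, db_path: str, table: str, key: str) -> list:
    """Return the keys whose source and target rows differ or are missing."""
    with open(extract_csv, newline="") as f:
        reader = csv.DictReader(f)
        columns = reader.fieldnames
        source = {row[key]: fingerprint(row[c] for c in columns) for row in reader}

    conn = sqlite3.connect(db_path)
    cols = ", ".join(columns)
    target = {str(r[0]): fingerprint(r[1:])
              for r in conn.execute(f"SELECT {key}, {cols} FROM {table}")}

    mismatches = [k for k in source if source[k] != target.get(k)]
    print(f"source rows: {len(source)}, target rows: {len(target)}, "
          f"mismatches: {len(mismatches)}")
    return mismatches

# e.g. compare("customer_pilot.csv", "staging.db", "customer", "id")
```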

Second, ensure the data is migrated into a database format that is accessible by the widest range of applications, and that can run on low-cost, flexible computing platforms – ideally a structured, relational SQL-compliant database.  This helps to ensure flexible, open access to the data for a range of different user types.  Data structuring tools are available to simplify migrations to relational databases.
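
By way of illustration, the sketch below lands a couple of structured rows in a relational table, with SQLite standing in for whichever SQL-compliant database is chosen; the schema, file name, and data are hypothetical.

```python
# Sketch: land the structured legacy extract in a SQL-compliant relational
# table. SQLite stands in for the chosen database; schema and rows are
# hypothetical.
import sqlite3

rows = [
    ("000123", "ACME SUPPLY CO.", 125.99),
    ("000124", "BETA HOLDINGS", 740.00),
]

conn = sqlite3.connect("legacy_archive.db")   # hypothetical archive database file
conn.execute("""
    CREATE TABLE IF NOT EXISTS supplier_account (
        account_id TEXT PRIMARY KEY,
        name       TEXT NOT NULL,
        balance    NUMERIC
    )
""")
conn.executemany("INSERT OR REPLACE INTO supplier_account VALUES (?, ?, ?)", rows)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM supplier_account").fetchone()[0])  # 2
```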

Access all areas

Once the migration is complete, the focus should be on how users will access the data: on building the applications that will support easy, flexible but robust data access.  The key to this is to use a tool that takes advantage of the open, Web model to run on any hardware and operating system at both the server and client side, delivering customizable data views and queries within a browser-based interface.

This gives even non-technical users uniform access to data migrated from legacy systems from a familiar point-and-click interface – minimizing the need for user training.  It also helps organizations avoid ongoing licensing, maintenance and hardware costs for access to legacy data, and can give access to data over the Web from any location.
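
As a bare-bones sketch of that access pattern, the example below serves a read-only view of an archived table to any browser. It assumes the retired data sits in a hypothetical SQLite file, legacy_archive.db, with a supplier_account table (as in the loading sketch above); a real deployment would add authentication, paging, filtering, and configurable views.

```python
# Bare-bones, read-only browser access to an archived relational table.
# The SQLite file and table name are hypothetical; a production tool would add
# authentication, input handling, paging, and richer query options.
import html
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB_PATH, TABLE = "legacy_archive.db", "supplier_account"

class ArchiveHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        conn = sqlite3.connect(DB_PATH)
        cursor = conn.execute(f"SELECT * FROM {TABLE} LIMIT 100")
        headers = [d[0] for d in cursor.description]
        rows = cursor.fetchall()
        conn.close()

        body = "<table border='1'><tr>"
        body += "".join(f"<th>{html.escape(h)}</th>" for h in headers) + "</tr>"
        for row in rows:
            cells = "".join(f"<td>{html.escape(str(v))}</td>" for v in row)
            body += f"<tr>{cells}</tr>"
        body += "</table>"

        page = f"<html><body><h1>Legacy data archive</h1>{body}</body></html>".encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(page)

if __name__ == "__main__":
    HTTPServer(("", 8080), ArchiveHandler).serve_forever()  # browse http://localhost:8080
```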

This makes application retirement a more efficient and cost-effective way of dealing with legacy systems, compared to other alternatives such as modernizing the application – which may mean expensive hardware updates, terminal emulation and so on – or transferring to a virtualized environment, which can carry a significant penalty in migration costs and ongoing management.

Considering the benefits

Retiring legacy systems can deliver considerable benefits.  First, analysts estimate that the payback period on the initial outlay is often less than 12 months, and that the total ROI over three years usually exceeds 150%.

What’s more, by decommissioning your legacy systems, your IT team can focus on more strategic tasks than maintenance and support for old platforms.  There are also benefits such as reduced risk of outages, faster delivery of new product initiatives thanks to fewer integration and support issues, and a more streamlined disaster recovery plan.

So, with the benefits that application retirement can offer, letting go of your IT department’s past while preserving the business information could have a significant impact on your operations.  Applications and platforms will come and go, but data is forever.