High Availability Requirements for Mainframe Data Modernization — Running tcVISION in Global Availability Zones

by Joseph Brady, Director of Business Development / Cloud Alliance Lead at Treehouse Software, Inc.

Many customers embarking on Mainframe-to-Cloud data replication projects with Treehouse Software are looking at high availability (HA) as a key consideration in the planning process. The goal of HA is to ensure that systems remain functioning and accessible, with deployments located in multiple Availability Zones (AZs) worldwide. An HA architecture protects against data center, Availability Zone, server, network, and storage subsystem failures, keeping the business running without downtime or human intervention.

In this blog, we will give a high-level overview of the tcVISION HA architecture, using AWS as an example; however, the basic principles of HA are essentially the same across all Cloud platforms.

Example of the tcVISION HA Architecture on AWS

During tcVISION’s Change Data Capture (CDC) processing for Mainframe-to-Cloud data replication, HA must be maintained in the AWS environment. The Amazon Elastic Compute Cloud (Amazon EC2) instance that hosts the tcVISION Manager is part of an Auto Scaling Group whose instances are spread across multiple AZs.

___tcVISION_AWS_HA_Architecture
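As a rough illustration, the sketch below (Python with boto3) shows how an Auto Scaling Group spanning two Availability Zones might be provisioned. The group name, launch template, and subnet IDs are placeholders, and the launch template is assumed to install and start the tcVISION Manager on boot; it is a sketch of the AWS side only, not a prescribed tcVISION configuration.

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Hypothetical names and subnet IDs; the two subnets sit in different AZs,
# so a failed instance can be replaced in either zone.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="tcvision-manager-asg",
    LaunchTemplate={"LaunchTemplateName": "tcvision-manager-lt", "Version": "$Latest"},
    MinSize=1,
    MaxSize=1,
    DesiredCapacity=1,
    VPCZoneIdentifier="subnet-aaaa1111,subnet-bbbb2222",
    HealthCheckType="EC2",
    HealthCheckGracePeriod=300,
)
```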

Upon failure, a replacement Amazon EC2 instance running the tcVISION Manager is launched and communicates its IP address to the mainframe tcVISION Manager, which then resumes communication with the replacement instance.
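tcVISION handles this handshake itself. Purely for illustration, the sketch below shows how a script on the replacement instance could look up its own private IP address from the EC2 instance metadata service (IMDSv2); how that address is then passed to the mainframe Manager is the product’s concern and is not shown here.

```python
import urllib.request

# IMDSv2: request a session token, then read this instance's private IP.
token_req = urllib.request.Request(
    "http://169.254.169.254/latest/api/token",
    method="PUT",
    headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
)
token = urllib.request.urlopen(token_req).read().decode()

ip_req = urllib.request.Request(
    "http://169.254.169.254/latest/meta-data/local-ipv4",
    headers={"X-aws-ec2-metadata-token": token},
)
private_ip = urllib.request.urlopen(ip_req).read().decode()
print(private_ip)  # the address the replacement Manager would report
```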

Once the Amazon EC2 tcVISION Manager is restarted, it continues processing at its next logical restart point, using a combination of the LUW and Restart files. LUW files contain committed data transactions not yet applied to the target database. Restart files contain a pointer to the last captured and committed transaction and queued uncommitted CDC data. Both file types are stored on a highly available data store, such as Amazon Elastic File System (EFS).
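Purely as a conceptual sketch (this is not tcVISION’s actual file format or restart logic), the snippet below illustrates the idea of a restart pointer kept on shared storage: a replacement instance reads the last committed position and skips transactions that were already applied. The EFS mount path and record layout are hypothetical.

```python
import json
from pathlib import Path

# Hypothetical path on an EFS mount shared by all AZs.
RESTART_FILE = Path("/mnt/efs/tcvision/restart.json")

def last_applied_position():
    """Return the last committed position, or 0 if no restart file exists yet."""
    if RESTART_FILE.exists():
        return json.loads(RESTART_FILE.read_text())["last_committed_tx"]
    return 0

def record_position(tx_id):
    """Persist the position of the most recently applied transaction."""
    RESTART_FILE.write_text(json.dumps({"last_committed_tx": tx_id}))

def apply(transactions):
    start = last_applied_position()
    for tx in transactions:
        if tx["id"] <= start:
            continue                 # already applied before the failover
        # ... apply the committed change to the target database here ...
        record_position(tx["id"])
```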

For production workloads, Treehouse Software recommends deploying the target and metadata databases with Multi-AZ enabled.
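If, for example, the target or metadata database runs on Amazon RDS, Multi-AZ can be enabled with a single API call. The instance identifier below is hypothetical.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Enables a synchronous standby in a second AZ with automatic failover.
rds.modify_db_instance(
    DBInstanceIdentifier="tcvision-target-db",
    MultiAZ=True,
    ApplyImmediately=True,
)
```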

To keep all of this dynamic data highly available, tcVISION uses Amazon EFS, which provides a simple, scalable, fully managed, elastic file system for use with AWS Cloud services and on-premises resources. EFS is built to scale on demand to petabytes without disrupting applications; it grows and shrinks automatically as files are added and removed, eliminating the need to provision and manage capacity to accommodate growth.
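As a sketch of the underlying setup, the snippet below creates an EFS file system and a mount target in each of two Availability Zones, so that a replacement instance in either AZ can reach the LUW and Restart files. The subnet and security group IDs are placeholders.

```python
import boto3

efs = boto3.client("efs", region_name="us-east-1")

fs = efs.create_file_system(
    CreationToken="tcvision-luw-restart",
    PerformanceMode="generalPurpose",
    Encrypted=True,
)

# One mount target per AZ keeps the shared files reachable after a failover.
for subnet in ["subnet-aaaa1111", "subnet-bbbb2222"]:
    efs.create_mount_target(
        FileSystemId=fs["FileSystemId"],
        SubnetId=subnet,
        SecurityGroups=["sg-0123456789abcdef0"],
    )
```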


Treehouse Software can help organizations immediately start moving their mainframe data to the Cloud and take advantage of the most advanced, scalable, secure, and highly available technologies in the world with tcVISION

tcVISION_Overall_Diagram_General_Cloud

tcVISION supports a vast array of integration scenarios throughout the enterprise, providing easy and fast data migration for mainframe application modernization projects and enabling bi-directional data replication between mainframe, Cloud, Open Systems, Linux, Unix, and Windows platforms.

View the Unequalled List of Environments Supported by tcVISION Here


__TSI_LOGO

___AWS_Select_Partner_Badge ___Google_Cloud_Partner_Badge

Contact Treehouse Software for a Demo Today…

Just fill out the Treehouse Software Product Demonstration Request Form and a Treehouse representative will contact you to set up a time for your tcVISION demonstration. This will be a live, on-line demonstration that shows tcVISION replicating data from the mainframe to a Cloud target database.

Now, more than ever, enterprises with mainframes are looking to modernize their legacy systems

by Joseph Brady, Director of Business Development / Cloud Alliance Lead at Treehouse Software, Inc.

Rapidly changing global health, economic, and political conditions are making fast access to the most current information more important than ever for official agencies and the public. As a result, modernizing information systems is taking center stage, especially for organizations with critical mainframe data residing on a variety of long-standing data stores, often still updated by COBOL applications! These sources include Db2, VSAM, IMS/DB, Adabas, IDMS, and Datacom, as well as sequential files. Unlocking the value of this important data can be difficult, because it may be used by numerous interlinked and dependent programs that have been in place for many years, sometimes decades.

Many organizations are now looking for modernization solutions that allow their legacy mainframe environments to continue, while replicating data in real time on highly available Cloud-based platforms (AWS, Google Cloud, Microsoft Azure, etc.). With a “data-first” approach, immediate data replication to the Cloud is enabling government, healthcare, supply chain, financial, and a variety of public service organizations to meet spikes in demand for vital information, especially in times of crisis.

Treehouse Software can help organizations immediately start moving their mainframe data to the Cloud and take advantage of the most advanced technologies in the world with tcVISION

tcVISION_Overall_Diagram_General_Cloud

Whether an enterprise needs to take advantage of the latest Cloud services, such as big data analytics, artificial intelligence (AI), rapid global database deployments, high-level security, etc., or move data to a variety of newer Cloud or Open Systems databases, the transition doesn’t have to be a sudden big bang.

A Phased Approach

Treehouse Software has extensive mainframe experience and subject matter experts to help organizations incrementally replicate their mainframe data to the Cloud and other modern systems, while keeping both sides synchronized.

Treehouse Software’s expert technical representatives help customers develop a phased plan that includes installation and implementation of the tcVISION mainframe data modernization product, script customization, data replication mapping, high availability, security, monitoring, training, and more.

Once the architecture is defined, the production deployment phase begins with incremental, sprint-like rollouts, and additional files are then moved into production on a regular cadence.

This phased plan enables tcVISION to synchronize critical mainframe data with a Cloud or Open Systems database. Bi-directional, real-time data synchronization allows changes on either platform to be reflected on the other (e.g., a change to a PostgreSQL table is reflected back on the mainframe). The customer can then modernize applications on Cloud or Open Systems platforms without disrupting the existing critical work on the legacy system.
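As a simple illustration of the kind of change that bi-directional replication would carry back to the mainframe, the sketch below updates a row in a PostgreSQL target using the psycopg2 driver. The table, columns, and connection details are hypothetical; capturing and reverse-replicating the change is handled by tcVISION once bi-directional synchronization is configured, and is not shown here.

```python
import psycopg2

# Hypothetical connection details and table.
conn = psycopg2.connect(host="target-db.example.com", dbname="modernized",
                        user="app", password="...")
with conn, conn.cursor() as cur:
    # A committed change like this on the PostgreSQL side is the sort of
    # update that bi-directional replication would reflect on the mainframe.
    cur.execute(
        "UPDATE customer SET phone = %s WHERE customer_id = %s",
        ("412-555-0100", 1001),
    )
conn.close()
```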

Additionally, tcVISION customers see drastically reduced mainframe MIPS costs, and increased ability to quickly respond to business environment changes.

Enterprise ETL and Real-Time, Bi-Directional Data Replication Through Change Data Capture with tcVISION

tcVISION uses an intuitive Windows GUI for administration, mapping and modeling, script generation, and monitoring. The product focuses on Change Data Capture (CDC) when transferring information between mainframe data sources and modern databases and applications. Through an innovative technology, changes occurring in any mainframe application data are tracked and captured, then published to a variety of targets.
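At a conceptual level (this is not tcVISION’s interface), CDC boils down to reading captured changes in commit order and applying them to a target while tracking how far processing has advanced, as in the sketch below. The functions are placeholders for whatever capture and apply mechanisms a given product or pipeline provides.

```python
import time

def fetch_committed_changes(since_position):
    """Placeholder: read captured changes newer than `since_position`, in commit order."""
    return []  # e.g., [{"position": 42, "op": "UPDATE", "table": "CUSTOMER", "row": {...}}]

def apply_to_target(change):
    """Placeholder: write one change to the target database."""
    pass

position = 0
while True:
    for change in fetch_committed_changes(position):
        apply_to_target(change)
        position = change["position"]  # advance only after a successful apply
    time.sleep(5)                      # poll interval; real products stream continuously
```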

tcVISION – Supported Sources and Targets

tcVISION supports a vast array of integration scenarios throughout the enterprise, providing easy and fast data migration for mainframe application modernization projects and enabling bi-directional data replication between mainframe, Cloud, Open Systems, Linux, Unix, and Windows platforms.

View the Unequalled List of Environments Supported by tcVISION Here

