Treehouse Software Eases the Challenges of Mainframe-to-Modern Platform Transformation Programs for Systems Integrators

Unlock Your Customers’ Business Value Faster with Frequent Production Deliveries, Using Agile Methods

Treehouse Software’s product, tcVISION, has proven vital to the ease and success of many mainframe modernization initiatives.

A major problem for a Systems Integrator (SI), or any mainframe modernization vendor, is the myriad of dependencies between mainframe and open-systems data and components. As part of a mainframe modernization project, these vendors must determine the order of the phases, or waves, in which applications will be migrated. Shared databases often make it very difficult to move one set of applications without moving others. This can lead to the conclusion that a “big bang” approach is the only option, resulting in a bid for a program with a single large delivery at the end, once all of the integrated components are in place. This is very unappealing to customers, who pay out large sums without the corresponding business value of a production deployment, and it also significantly increases project risk.

[Diagram: Mainframe-to-LUW data replication]

With Treehouse Software’s bi-directional replication product, tcVISION, a subset of applications can be migrated, and tcVISION’s data synchronization capabilities can be used to bridge specific shared legacy mainframe databases to their modern counterparts on open systems. This allows the SI to deliver in phases, getting modernized modules into production faster. It also decreases risk, because issues are found much sooner with more frequent production deployments. When the last modules are deployed to production, the synchronization is no longer needed.

As an example, a large Treehouse customer that is gradually modernizing and replacing a set of applications is currently using this approach. While the applications are being modernized, the customer will keep its mainframe Adabas database synchronized with Oracle on Linux. Treehouse Software’s experts and the tcVISION product are ensuring that the customer’s transition is smooth and seamless.


Enterprise ETL and Real-Time Data Replication Between Virtually Any Source and Target

tcVISION supports a vast array of integration scenarios throughout the enterprise, providing easy and fast data migration for mainframe application modernization projects and enabling bi-directional data replication between mainframe, Linux, Unix, and Windows platforms. This innovative technology offers comprehensive capabilities to identify and capture changes occurring in mainframe and relational databases, then publish the required information to a wide variety of targets, both on-premises and Cloud-based.

[Diagram: tcVISION connection overview]

tcVISION acquires data in bulk or via change data capture methods, including in real time, from virtually any IBM mainframe data source (Software AG Adabas, IBM DB2, IBM VSAM, IBM IMS/DB, CA IDMS, CA Datacom, sequential files), and transforms and delivers it to virtually any target. In addition, the same product can extract and replicate data from a variety of non-mainframe sources, including Adabas LUW, Hadoop/HDFS, MongoDB, Oracle Database, Microsoft SQL Server, IBM DB2 LUW and DB2 BLU, IBM Informix, and PostgreSQL, among others.



Visit the Treehouse Software website for more information on tcVISION, or contact us to discuss your needs.

TREETIP: Integrate Mainframe Data Sources In Your Big Data Initiatives

tcVISION supports a vast array of integration scenarios throughout the enterprise, providing easy and fast data migration for mainframe application modernization projects and enabling bi-directional data replication between mainframe, Linux, Unix, and Windows platforms. This innovative technology offers comprehensive capabilities to identify and capture changes occurring in mainframe and relational databases, then publish the required information to a wide variety of targets, both on-premises and Cloud-based.

Analysts have observed that perhaps 80 percent of the world’s corporate data still resides on mainframes. So it’s no surprise that Bloor Research (http://www.bloorresearch.com/research/spotlight/big-data-and-the-mainframe/) notes that “it is necessary today to place the mainframe as a ‘first-class player’ in any enterprise Big Data strategy.”

In February 2017 we highlighted tcVISION’s support for replication to the leading NoSQL database MongoDB. MongoDB continues to increase in popularity as a back end for operational applications with real-time requirements.

tcVISION also supports analytics and “mainframe offload” Big Data use cases that generally leverage Hadoop HDFS and/or streaming data transport. With tcVISION, data from a wide variety of IBM mainframe data sources can be quickly and easily replicated to Big Data targets, requiring minimal mainframe know-how and having minimal impact on the mainframe.

[Diagram: tcVISION Big Data replication]

Boost the return on investment for your Big Data initiatives using tcVISION!


Find out more about tcVISION — Enterprise ETL and Real-Time Data Replication Through Change Data Capture

tcVISION provides easy and fast data migration for mainframe application modernization projects and enables bi-directional data replication between mainframe, Linux, Unix and Windows platforms.

[Diagram: tcVISION overview]

tcVISION acquires data in bulk or via change data capture methods, including in real time, from virtually any IBM mainframe data source (Software AG Adabas, IBM DB2, IBM VSAM, IBM IMS/DB, CA IDMS, CA Datacom, even sequential files), and transforms and delivers it to virtually any target. In addition, the same product can extract and replicate data from a variety of non-mainframe sources, including Adabas LUW, Oracle Database, Microsoft SQL Server, IBM DB2 LUW and DB2 BLU, IBM Informix, and PostgreSQL.



Visit the Treehouse Software website for more information on tcVISION, or contact us to discuss your needs.

Two Local Technology Companies Partner to Advance Cognitive Computing; Complementary Areas of Expertise Mean Better Data Integration and Individualization

Treehouse Software, Inc., of Sewickley, PA, and Cognistx of Pittsburgh, PA, announced a partnership to help customers with improved data integration and individualization to fully leverage the power of cognitive computing.

Technology industry leaders from Accenture to Gartner to McKinsey recognize that the future of computing will be cognitive, calling it a disruptive force and estimating that the industry will reach $200 billion by 2020. Cognitive computing is based on leading-edge technology, including artificial intelligence, natural language processing, Big Data, advanced analytics, and machine learning algorithms.


The Treehouse – Cognistx partnership will allow customers to ingest massive amounts of data, whether that data is numbers, images, or audio files, and mine it to find insights that lead to action, and ultimately to increased revenue from improved customer engagement.

Since the mid-1990s, Treehouse Software has been a global leader in mainframe data migration, replication and integration, offering robust and flexible solutions for ETL, CDC and real-time, multidirectional replication between databases on various platforms.

Cognistx is an applied technology company harnessing state-of-the-art cognitive computing tools to help retailers reach individuals with intuitive, intelligent and individualized offers based on their past transactions, preferences, context and profile.


Cognistx complements Treehouse’s data delivery capability with machine learning algorithms that become more accurate with every transaction, delivering customized, personalized, prescriptive actions in the right context. Together, the two companies will co-market their capabilities, bringing new competitive advantages to customers who want to expand the use of their most valuable asset — data.

“We’re excited to partner with Cognistx to bring our world-class enterprise data acquisition capabilities to companies that recognize the massive opportunity cognitive computing represents,” said Wayne Lashley, Treehouse Chief Business Development Officer. “We provide the data foundation and Cognistx translates that data into insights, those insights into customer actions, and those actions into incremental revenue.”

“Few retailers do a good job of marrying technology with a customized customer experience that is tailored to their behaviors and timed according to how they might use a retailer’s offer,” said Sanjay Chopra, CEO of Cognistx. “With our proprietary algorithms and Treehouse’s enterprise data solutions, both our customers win. Only with large amounts of data can our system learn about the consumer and their preferences and how those change in order to deliver only the smartest, most individualized offers.”

About Treehouse Software, Inc.

Privately held Treehouse Software was founded in 1982 and is a global leader in providing data migration, replication, and integration solutions for the most complex and demanding heterogeneous environments. Treehouse offers a comprehensive and flexible portfolio of software and tools for mainframe platforms, including feature-rich, accelerated-ROI offerings for information delivery and application modernization. http://www.treehouse.com

About Cognistx

Privately held Cognistx was founded in 2015 and has a technology hub in Pittsburgh and operations offices in the Innovation Quarter in Winston-Salem, NC, and in Raleigh. The company’s co-founders include Sanjay Chopra, a serial technology entrepreneur; Eric Nyberg, professor at Carnegie Mellon University’s School of Computer Science, who consulted with IBM on the Watson project; and Jeffrey Battin, former owner of Communefx, a successful data analytics company. Other partners include Florian Metze, professor at Carnegie Mellon University’s School of Computer Science; Jill Zoria, SVP Enterprise Development; Pete Minnelli, SVP Creative; and Karen Barnes, SVP Operations. http://www.cognistx.com

TREETIP: tcVISION Supports Hadoop

by Joseph Brady, Manager of Marketing and Technical Documentation at Treehouse Software, Inc.


Hadoop and Big Data are revolutionizing data processing. Because of increasing digitalization, the Internet, the rising importance of social media, and the Internet of Things, data diversity is growing in dimensions that did not exist before.

New technologies such as Hadoop have been developed to process and maintain large and diverse data sets in a meaningful way. What is Hadoop? Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. It is an open-source project sponsored by the Apache Software Foundation.

Enterprises with heterogeneous IT infrastructures, especially larger corporations across all industry sectors and public institutions, very often include mainframe technology. These enterprises now face the challenge of integrating existing mainframe data into a Hadoop platform in real time.

Data integration technology has also evolved greatly over the past decades. Today, a standard ETL solution is no longer sufficient; data integration must now encompass the entire data exchange process, including replication and synchronization. Data exchange has become a time-critical process, and near-real-time delivery is increasingly the only acceptable way to keep information current as mainframe and Hadoop technologies coexist.

The tcVISION Solution

An important part of the added value of modern IT systems is low-latency data and process integration across transactional and analytical environments. The cross-system integration platform from Treehouse Software, tcVISION, is unique, efficient, and reliable. With tcVISION, mainframe data can be quickly and easily integrated, in near real time, into Hadoop-based operational applications or Business Intelligence and analytics platforms.

The tcVISION solution is proven and mature, and is constantly under development to meet the requirements of new technologies, including support for Hadoop in Version 6.

The main focus of the tcVISION integration platform is real-time synchronization that integrates mainframe data into Hadoop-based solutions.

[Diagram: tcVISION Hadoop solution]

The tcVISION Technology Components

The tcVISION integration platform consists of a variety of state-of-the-art technology components, which cover much more than simply an ETL process.

  1. With tcVISION, data exchange in the sense of real-time synchronization becomes a single-step operation.
  2. No additional middleware is required.
  3. Modern Change Data Capture technologies allow efficient selection of the required data from the source system, with a focus on the changed data. The data exchange process is reduced to the necessary minimum, which results in lower costs for cross-system data integration.
  4. tcVISION also supports fast and efficient loading of large volumes of mainframe data into Hadoop. In this context, the processor costs on the mainframe are low and negligible.
  5. An integrated Data Repository guarantees overall cross-platform and transparent data management. Mainframe knowledge is not required.
  6. tcVISION includes a rule engine to transform data into a target-compliant format and allows user-specific processing via supplied APIs.
  7. The integrated staging concept supports the offload of changed data in “raw format” to less expensive processor systems, reducing mainframe processor usage to a minimum. The preparation of the data for the target system can be performed on a less expensive platform (Linux, UNIX, or MS-Windows); a simplified sketch of this offload step follows the list.
  8. The transfer and feeding of data into Hadoop is part of the tcVISION data exchange process. No intermediate files are required.
  9. The exchange of large volumes of data between a production mainframe environment and Hadoop can run in parallel processes to reduce latencies to a minimum.
  10. The tcVISION integration platform contains comprehensive control mechanisms and monitoring functions for automated data exchange.
  11. tcVISION has been designed so that Hadoop-based projects can be deployed with total project autonomy and a maximum reduction of mainframe resources.
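
As a rough illustration of points 7 and 8 above, the sketch below shows how a batch of change records that has already been offloaded to a Linux staging area might be prepared and appended to HDFS with the standard Hadoop file system shell. The file paths, record layout, and function names are hypothetical; this is a conceptual sketch, not tcVISION's actual interface.

    import json
    import subprocess
    from datetime import datetime, timezone

    # Hypothetical batch of change records already offloaded in "raw format"
    # to a Linux staging area. Field names and values are illustrative only.
    staged_changes = [
        {"table": "CUSTOMER", "op": "UPDATE", "key": "000123", "balance": "1875.40"},
        {"table": "CUSTOMER", "op": "INSERT", "key": "000124", "balance": "0.00"},
    ]

    def append_batch_to_hdfs(records, hdfs_path="/data/mainframe/customer/changes.json"):
        """Serialize a batch of change records as JSON lines and append it to HDFS."""
        local_file = "/tmp/customer_changes.json"
        with open(local_file, "w") as f:
            for rec in records:
                rec["captured_at"] = datetime.now(timezone.utc).isoformat()
                f.write(json.dumps(rec) + "\n")
        # The standard Hadoop file system shell moves the prepared batch into HDFS;
        # all preparation happens on the inexpensive Linux platform, not the mainframe.
        subprocess.run(["hdfs", "dfs", "-appendToFile", local_file, hdfs_path], check=True)

    append_batch_to_hdfs(staged_changes)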

With tcVISION, data synchronization between mainframe and Hadoop pays off

  • Near-real-time replication of mainframe data to Hadoop allows true real-time analytics, or the relocation of mainframe applications (e.g., Internet applications such as online banking and e-government) to Hadoop with synchronous data on both platforms.
  • Because the process concentrates on changed data, the costs of the data exchange are greatly reduced.
  • The utilization of mainframe resources is reduced to a level that minimizes costs for mainframe know-how and mainframe MIPS.
  • Data exchange processes can be deployed and maintained with tcVISION without mainframe knowledge, so costs can be saved and Hadoop projects can be developed and put into production faster.
  • tcVISION’s near-real-time replication from mainframe to Hadoop allows the relocation of BI reporting and analytic applications to the more cost-efficient and, for these applications, more powerful Hadoop platform.

Visit the Treehouse Software website for more information on tcVISION, or contact us to discuss your needs.

Mainframe CDC from Treehouse Software

The globalization of markets, the increase in data volumes, and the high demand for up-to-date information require new data transfer and exchange solutions for heterogeneous IT architectures, and as many customers have discovered, Treehouse Software has the right product (or combination of products) to meet any conceivable mainframe data migration, replication, or integration requirement. To meet many of these needs, Treehouse Software’s proven and mature tcVISION product moves as little data as possible and as much as necessary. tcVISION is an innovative software solution that processes changed data in real time, at intervals, or on an event-driven basis.

[Diagram: Change Data Capture]

The tcVISION solution focuses on change data capture (CDC) when transferring information between mainframe data sources and LUW databases and applications. Changes occurring in any mainframe application data are tracked and captured, and then published to a variety of RDBMS and other targets.

[Diagram: tcVISION overview]

tcVISION enables bidirectional replication for DB2, Oracle, and SQL Server running on Linux/Unix/Windows, and synchronizes each data source, first by doing a bulk load from source(s) to target and then by replicating only changes, and only committed changes, from source(s) to target. As a result, there can never be ambiguity as to whether a query against the target database involves uncommitted data.
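
To make the “committed changes only” behavior concrete, here is a minimal conceptual sketch in Python. It assumes a hypothetical stream of change records tagged with transaction identifiers and buffers each transaction's changes until a COMMIT arrives; it illustrates the general CDC principle only and is not tcVISION's implementation.

    from collections import defaultdict

    def apply_committed_changes(change_stream, apply_to_target):
        """Buffer change records per transaction and apply them only on COMMIT.

        The record layout below ({"txn": ..., "op": ..., "row": ...}) is a
        hypothetical example used purely for illustration.
        """
        pending = defaultdict(list)  # transaction id -> buffered changes
        for record in change_stream:
            txn = record["txn"]
            if record["op"] == "COMMIT":
                for change in pending.pop(txn, []):
                    apply_to_target(change)   # the target only ever sees committed data
            elif record["op"] == "ROLLBACK":
                pending.pop(txn, None)        # discard uncommitted work entirely
            else:
                pending[txn].append(record)   # hold until the transaction outcome is known

    # Example: only transaction T1's update reaches the target; T2 is rolled back.
    applied = []
    stream = [
        {"txn": "T1", "op": "UPDATE", "table": "ACCOUNT", "row": {"id": 1, "bal": 50}},
        {"txn": "T2", "op": "INSERT", "table": "ACCOUNT", "row": {"id": 2, "bal": 10}},
        {"txn": "T2", "op": "ROLLBACK"},
        {"txn": "T1", "op": "COMMIT"},
    ]
    apply_committed_changes(iter(stream), applied.append)
    print(applied)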

Read some tcVISION customer success stories here.


Visit the Treehouse Software website for more information on tcVISION, or contact us to discuss your needs.

 

A Well-Earned (Application) Retirement

Guest blogger Howard Sherrington, CEO of NSC LegacyData Solutions Ltd., developers of DataNovata, discusses how retiring legacy applications can cut IT costs and free up resources – if the right approach is taken to accessing the legacy data.

The one constant in business IT is that yesterday’s new systems will become tomorrow’s legacy.  As organizations evolve and new technologies emerge, IT departments have to deal with the impact of this constant evolution and change on business operations.

But change events are difficult to predict.  Mergers or acquisitions can transform even recent deployments into duplicate or legacy systems, as the IT function struggles to keep up with the changing demands of the business.

Legacy systems represent a drain on IT resources, both in terms of cost and manpower. Industry analyst Gartner conservatively estimates that businesses spend around 10%–25% of their IT budget on supporting and managing legacy systems, and I believe that this can rise to as much as 35% in some organizations. This is especially true for organizations that are running old and complex systems and applications.

So, given the current business climate, where capital for new projects is harder to come by and operational expenditure for existing systems is under close scrutiny, it’s prudent to look for more efficient ways of dealing with these older applications than simply keeping them running.

It’s all about the data

In these circumstances, it’s essential to establish why the legacy system is being maintained.  In the vast majority of cases, it’s because of the data held in the system or application.

I frequently hear the statement, “No, we can’t do anything with that service, as we still need it”, even though nine times out of 10 it’s simply access to the data that’s required, rather than the application itself.  This also correlates with analyst Forrester’s estimation that nearly 85% of data in databases is inactive, simply being stored for subsequent access rather than being processed.

This is often the case for financial systems, especially in pensions and investment management, which usually lie dormant – at a considerable cost, as we saw earlier – so that the business retains access to the legacy data for legal, taxation, due diligence, or compliance purposes.

A retirement bonus

So why not retain the vital elements and let go of the redundant parts?  By separating the legacy data that the business needs from the legacy system, and then decommissioning the applications and platforms, a business could make substantial savings in budgets, support and resource commitment.

There’s also the opportunity to increase the efficiency of operations by giving staff wider, more flexible access to the legacy data. Let’s take a closer look at how application retirement should be approached and managed, and the benefits it can offer your business.

Migration matters

The initial issue in application retirement is the migration of the data from the legacy application or platform.  Exactly how this is done will depend on several factors, including the type and age of the application and platform, and how the data is stored.  However, there are a couple of key ‘best practice’ points which should be observed in any migration project.

First, reduce the risk of data loss or damage by testing your procedures.  Make sure you have a backup copy of the data before trying a migration, and if possible, pilot the process using a small subset of the data.  Then you can compare the extracted data with the original to ensure the process isn’t changing the data in any way.
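
As a simple illustration of that comparison step, the sketch below fingerprints a pilot extract and its migrated copy using row counts and per-row checksums. The file names and key column are hypothetical, and the approach is a generic one rather than being tied to any particular tool.

    import csv
    import hashlib

    def table_fingerprint(csv_path, key_column):
        """Return the row count and a per-key checksum for a CSV extract."""
        digests = {}
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
                digests[row[key_column]] = hashlib.md5(canonical.encode()).hexdigest()
        return len(digests), digests

    # Pilot check on a small subset before attempting the full migration.
    src_count, src_digests = table_fingerprint("legacy_extract_sample.csv", "policy_id")
    tgt_count, tgt_digests = table_fingerprint("migrated_sample.csv", "policy_id")

    if src_count != tgt_count:
        print(f"Row count mismatch: {src_count} source rows vs {tgt_count} target rows")
    mismatched = [key for key, digest in src_digests.items() if tgt_digests.get(key) != digest]
    print(f"{len(mismatched)} rows differ between source and target")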

Second, ensure the data is migrated into a database format that is accessible by the widest range of applications, and that can run on low-cost, flexible computing platforms – ideally a structured, relational SQL-compliant database.  This helps to ensure flexible, open access to the data for a range of different user types.  Data structuring tools are available to simplify migrations to relational databases.
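
As a small, hedged example of that second point, the sketch below loads a pilot extract into an SQL-compliant relational table. SQLite stands in here for whichever relational database is actually chosen, and the table, file, and column names are hypothetical.

    import csv
    import sqlite3

    # SQLite is used only as a stand-in for an SQL-compliant relational target.
    conn = sqlite3.connect("legacy_archive.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS policy_history (
               policy_id   TEXT PRIMARY KEY,
               holder_name TEXT,
               status      TEXT,
               closed_date TEXT
           )"""
    )

    with open("legacy_extract_sample.csv", newline="") as f:
        rows = [
            (r["policy_id"], r["holder_name"], r["status"], r["closed_date"])
            for r in csv.DictReader(f)
        ]

    conn.executemany("INSERT OR REPLACE INTO policy_history VALUES (?, ?, ?, ?)", rows)
    conn.commit()

    # Once loaded, the archived data is queryable from any SQL-capable tool.
    for (count,) in conn.execute("SELECT COUNT(*) FROM policy_history"):
        print("rows archived:", count)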

Access all areas

Once the migration is complete, the focus should be on how users will access the data: on building the applications that will support easy, flexible, but robust data access. The key to this is to use a tool that takes advantage of the open, Web model to run on any hardware and operating system at both the server and client side, delivering customizable data views and queries within a browser-based interface.

This gives even non-technical users uniform access to data migrated from legacy systems from a familiar point-and-click interface – minimizing the need for user training.  It also helps organizations avoid ongoing licensing, maintenance and hardware costs for access to legacy data, and can give access to data over the Web from any location.

This makes application retirement a more efficient and cost-effective way of dealing with legacy systems, compared to other alternatives such as modernizing the application – which may mean expensive hardware updates, terminal emulation and so on – or transferring to a virtualized environment, which can carry a significant penalty in migration costs and ongoing management.

Considering the benefits

By retiring legacy systems, considerable benefits can be realized. First, analysts estimate that the payback on the outlay is often achieved in less than 12 months, and that total ROI over three years usually exceeds 150%.

What’s more, by decommissioning your legacy systems, your IT team can focus on more strategic tasks than maintenance and support for old platforms. There are also benefits such as reduced risk from outages, acceleration of new product initiatives due to fewer integration or support issues, and a more streamlined disaster recovery plan.

So with the benefits that application retirement can offer, letting go of your IT department’s past while preserving the business information could make a key impact on your operations.  Applications and platforms will come and go, but data is forever.