Modern Approach to Migration of Disparate Enterprise Data


by Joseph Brady, Marketing and Documentation Manager for Treehouse Software

I recently came across an article in the Data Migration Pro Journal, “Is it R.I.P for the Big-Bang Data Migration?” Given that site’s deep dive into data migration matters, I was intrigued not only by the title, but by how closely its message echoed what Treehouse Software has been preaching for years: the traditional strategy of parallel migration between the host and the target may be failsafe (you can always go back to the original system), but with the vast amounts of data usually involved and the myriad complications that can delay or completely disrupt the operation, the endeavor often ends in a failed migration project.

Since the mid-1990s, Treehouse has dominated the ADABAS-to-RDBMS data migration and integration market, with its proven and powerful ETL, CDC, and real-time replication solutions. More recently, the addition of expanded capabilities enables migration and integration of virtually any mainframe database or data source. In short, we can connect your enterprise—from anything to anything.

The combined power of our technology resources and know-how reduces cost and mitigates risk in mainframe IT project initiatives, where data migration and integration complexity is often underestimated, yet critical to success.

Efficient, standards-based, and automated approaches such as those provided by Treehouse’s products can be implemented in a fraction of the time of manual efforts, and far more cost-effectively and reliably. Treehouse Software’s data migration, replication, and integration solutions are intended for customers needing to access, integrate, replicate, and migrate data from Software AG’s ADABAS and other mainframe data sources. Such data sources often require migration or replication to relational database systems such as Oracle, Microsoft SQL Server, IBM DB2, Teradata, and Sybase.

To extend its reach to mainframe data sources such as VSAM, IMS, CA-IDMS, CA-Datacom, DL/I, DB2, and even sequential files, Treehouse teams with B.O.S. Software Service und Vertrieb GmbH of Haar, Germany, to distribute and support B.O.S.’s tcVISION and tcACCESS, mainframe data replication and data access products. These offerings provide the ability to deliver anywhere-to-anywhere data replication and integration, as well as direct SQL-based access to non-relational mainframe data structures.

tcACCESS is a comprehensive software solution that enables two-way integration between IBM mainframe systems and client/server, Web, and SOA technologies, without the need for mainframe knowledge or programming effort. tcACCESS is a proven platform that facilitates SQL-based integration of mainframe data sources and programs into open-systems and Windows applications using industry standards such as SQL, ODBC, JDBC, and .NET. SQL queries that access mainframe data can be easily created using drag-and-drop techniques—no programming required. The results of queries can be immediately presented and viewed (e.g., in Microsoft Excel or Microsoft Access). Direct ODBC or JDBC access to mainframe data from any client/server or Web application can be easily implemented; it is necessary only to assign the tcACCESS driver to the application.
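The appeal of this approach is that client code stays ordinary SQL regardless of what sits behind the driver. Here is a minimal sketch of that access pattern, using Python's built-in sqlite3 as a stand-in for the tcACCESS ODBC/JDBC driver (the table and connection details are illustrative, not tcACCESS specifics):

```python
import sqlite3  # stand-in: a real deployment would use the tcACCESS ODBC/JDBC driver

# With tcACCESS, a non-relational mainframe source (e.g., a VSAM file) is
# exposed as ordinary tables, so client-side code is plain SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Acme", "EMEA"), (2, "Globex", "NA")],
)

# The same query shape would run unchanged against the mainframe source.
rows = conn.execute(
    "SELECT name FROM customers WHERE region = ? ORDER BY name", ("EMEA",)
).fetchall()
print(rows)
```

Swapping the driver, not the query, is the whole point: the application is insulated from the underlying data structure.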

The tcVISION solution focuses on CDC when transferring information between mainframe data sources and Windows or open-systems databases and applications. Through an innovative technology, changes occurring in any mainframe application data are tracked and captured, and then published to a variety of RDBMSs and other targets. tcVISION’s capture facilities detect changes in mainframe data sources without programming effort, and reduce the amount of data that must be transferred between systems to an absolute minimum. tcVISION guarantees transparent, efficient, and auditable data transfer between sources and targets, and provides powerful routines to perform efficient, reliable bulk transfers of data.

Treehouse can help your organization with tools and expertise for the riskiest and most-often overlooked parts of mainframe modernization and integration projects—the data migration and integration. Leveraging our products and services will eliminate reliance on your programming staff to write and maintain data extracts and middleware and enable deployment of powerful, robust and secure data migration and replication implementations.

Read our case studies and let us know how we can help you make your next migration project a sure-fire success.

Cloud-y … with a 100 Percent Chance of Data


by Wayne Lashley, Chief Business Development Officer for Treehouse Software

Along with three of my colleagues, I recently participated in the Treehouse exhibit at the Gartner Application Architecture, Development and Integration (AADI) event in Las Vegas. This is a conference where we have exhibited in the past, and I personally have attended several other times. In fact, I just learned that the DI in AADI no longer stands for “Data Integration”; the change was made only in the past couple of years, and it was the data integration aspect that had made the show particularly relevant to Treehouse.

Though data integration vendors such as Informatica, Pervasive and Adeptia—and Treehouse—were in attendance, their numbers seemed diminished over prior years. And while “Legacy Modernization” had an entire subject “track” a couple of years ago, a number of “name” LM vendors were notably absent this year, and the topic was only rarely represented in sessions.

But there was a predominant theme at the event, and its name is Cloud.

People have been talking about “Cloud” for years already, and it is a well-established concept with many dimensions and extensive implementations. And it’s probably familiar enough to The Branches readers that I won’t waste words describing it, other than to say that it is simply a way to offer computing services via the Internet without the subscriber—most Cloud offerings are subscription-based—knowing or caring what or where the physical implementation is.

Many people consider that Salesforce.com is the granddaddy of all Cloud services, and to my mind it popularized the term “Software as a Service” (SaaS). Evolutionary Technologies, Inc. (ETI), a long-standing player in the data integration field and a company that I have had a lot of contact with over the years, reinvented itself around 2005 as a SaaS company, in doing so placing the company on the leading edge of “aaS” providers and essentially defining an entirely new market space.

These days there are a number of other “aaS” genres competing for mindshare and dollars, the most dominant being “Platform as a Service” (PaaS). Once again, Salesforce.com seemed to define the space initially, but others such as Amazon and Google have since come to dominate it. Just this week I was invited to an event for Oracle partners where Oracle executives will present their concept for an Oracle Cloud PaaS. I recall a Microsoft Worldwide Partners Conference (WWPC) a couple of years ago where Microsoft kicked off its Azure Cloud platform. You don’t have to install Microsoft Office on your PC anymore; Office 365 runs in the Cloud.

Even legacy applications are getting the Cloud treatment: a company called Heirloom Computing has commenced offering a platform for running legacy COBOL applications in the Cloud.

Cloud has also entered popular culture and commodity services. There’s a TV commercial that I keep seeing advertising a Cloud-based service that automatically troubleshoots, tunes up and cleans up your PC.

In short, you’re nobody if you’re not in the Cloud.

D-for-“Data” may have morphed into D-for-“Development” in the AADI Summit name, but data replication, integration and migration remain very relevant in the Cloud age. Indeed, you can’t spell Cloud without a D.

To support provisioning of Cloud-based applications, there has to be a means for getting data from where it is now—often in mainframe-based legacy databases or relational databases on open systems, within a company’s internal IT infrastructure—to the Cloud facilities, be they public or private. This doesn’t happen by magic. We have recently been working on a customer implementation where Oracle and DB2 data are being replicated bidirectionally in the Cloud using our tcVISION solution. Such a scenario posed a bit of a challenge for us in terms of licensing: the machines on which tcVISION is installed are not specifically known at a given point in time. So we had to adapt our licensing model to accommodate the new reality.

We expect to see continued growth and demand for our replication and integration solutions as Cloud offerings evolve and expand. Furthermore, we are working on a new Cloud-oriented solution in collaboration with Cloud platform providers. I have briefed several Gartner analysts on it, and their feedback has been encouraging. Check back to this space regularly for news on this exciting new Treehouse offering.

A Data Replication Solution that Hits the Target (and Source)

by Chris Rudolph, Senior Technical Representative for Treehouse Software and Joseph Brady, Marketing and Documentation Manager for Treehouse Software

Today’s IT organization is dealing with any number of challenges, including heterogeneous environments, legacy applications, high-availability information systems, data silos, increasing data volumes, and escalating costs. These factors make it imperative that IT departments find cost-effective solutions for enterprise-wide data management, preferably using intelligent data integration and efficient data synchronization.

Data exchange in a heterogeneous IT infrastructure requires harmonizing different data formats and data models. Very often, this data exchange is a complex and tedious task that represents a major cost factor. Data exchange is also time-sensitive and critical, hence reliability and auditability of all data movements are important.

Since the mid-1990s, Treehouse Software has dominated the ADABAS-to-RDBMS data migration and integration market, with its proven and powerful ETL, CDC, and real-time replication solutions. The addition of two mainframe integration products, tcACCESS and tcVISION, enables the migration of virtually any mainframe database or data source in a cost-effective manner.

Enterprise data integration with tcACCESS provides a powerful integration platform (mainframe software, workstation software, middleware) for IBM mainframes, enabling transparent integration of mainframe data sources and programs into open-systems applications using industry standards like SQL, ODBC, JDBC, and .NET. A modular software solution, tcACCESS comprises a base system that can be implemented either as a CICS transaction or as a VTAM application, and provides its own communication modules. The heart of the system is the tcACCESS SQL Engine, which allows access to mainframe data sources using SQL statements, and features:

  • Bi-directional data-exchange across heterogeneous systems
  • Direct data access across heterogeneous systems
  • Data transformation for data analysis and exchange
  • Relational access to legacy data and applications
  • Data federation – heterogeneous data views
  • Integration of mainframe files and DBMS structures
  • Data federation between mainframe and Windows/Open Systems data

CDC Replication with tcVISION

tcVISION supports data exchange between mainframe-based systems like IMS/DB and DB2, between mainframe and open-systems servers like DB2 and Oracle, as well as within an open-systems environment (e.g., DB2 LUW to SQL Server or Oracle).

tcVISION’s unmatched array of CDC methods includes “Loopback Suppression” for bi-directional updates, ensuring data integrity and replication efficiency between the data sources and targets so that changes received from a source system and applied to the target are not unintentionally propagated back.
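tcVISION's internal mechanism isn't spelled out here, but the general idea behind loopback suppression can be sketched: tag every change the replicator applies with an origin marker, and have the capture side skip changes carrying that marker, so an update received from the peer is never sent back. The names and log structure below are purely illustrative:

```python
# Illustrative sketch of loopback suppression in bidirectional replication.
# Changes applied by the replicator carry an origin marker; capture drops
# any change with that marker, so replicated updates never echo back.
REPLICATOR_ORIGIN = "replicator"  # hypothetical marker value

def apply_change(target_log, change):
    """Apply a change received from the peer, recording the replicator as origin."""
    target_log.append({**change, "origin": REPLICATOR_ORIGIN})

def capture_changes(log):
    """Capture changes for outbound propagation, suppressing loopback."""
    return [c for c in log if c.get("origin") != REPLICATOR_ORIGIN]

# One locally originated change, plus one that arrived from the source side.
target_log = [{"key": 7, "op": "update", "origin": "local-app"}]
apply_change(target_log, {"key": 42, "op": "insert"})

# Only the locally originated change is propagated back to the peer.
outbound = capture_changes(target_log)
print([c["key"] for c in outbound])
```

Without some such origin tracking, a bidirectional pair would bounce every update back and forth indefinitely.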

The Problems:

  • Different data formats
  • Different data models
  • Large data volumes
  • Limited batch window
  • Extract programs are costly to run in a chargeback environment
  • Requirement for up-to-date information

The Solution:

Moving/replicating data…

  • as much as needed
  • as little as possible
  • as transparently as possible
  • as flexibly as possible
  • as securely as possible
  • using as little mainframe CPU as possible with tcVISION

tcVISION is a flexible data replication product that focuses on changed data from virtually any mainframe data source and transfers information between mainframes and workstations or open systems—in bulk, through batch Changed Data Capture (CDC), or in real time. Mainframe data exchange processes are considerably simplified using tcVISION. The structure of the existing mainframe data is analyzed by special processors, and the data mapping information is presented in a user-friendly and transparent format—even for users with no mainframe knowledge—and captured in a metadata repository.

tcVISION’s unique “stage processing” architecture allows most (and in some cases all) of the ETL and CDC processing to be performed on a Windows, UNIX, or Linux platform. This is an especially attractive feature for sites that are already at 100% utilization of their mainframe CPU, or that have a chargeback system in place.
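One simple way to picture batch CDC performed off the mainframe is snapshot differencing done entirely on the open-systems side, so only the net change set travels onward. This is a generic sketch of that idea, not tcVISION's internal algorithm:

```python
# Generic snapshot-diff sketch of batch CDC: compare the previous and current
# snapshots of a keyed data set and emit only the minimal change set.
def diff_snapshots(old, new):
    """Return inserts, updates, and deletes between two {key: row} snapshots."""
    inserts = {k: v for k, v in new.items() if k not in old}
    updates = {k: v for k, v in new.items() if k in old and old[k] != v}
    deletes = sorted(k for k in old if k not in new)
    return inserts, updates, deletes

old = {1: "Smith", 2: "Jones", 3: "Brown"}
new = {1: "Smith", 2: "Jones-Lee", 4: "Davis"}

inserts, updates, deletes = diff_snapshots(old, new)
# Unchanged row 1 is not transferred at all; only the delta moves.
print(inserts, updates, deletes)
```

The payoff is the one the article describes: unchanged data never crosses the wire, which matters most when mainframe CPU cycles or bandwidth are metered.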

tcVISION’s Windows-based Control Board provides an easy-to-use facility to configure and administer the data flow. tcVISION provides a variety of interfaces to allow seamless integration with ETL or EAI solutions.


Case in Point: Mainframe Data Synchronization at Blum

Blum manufactures high-quality fittings systems for kitchens and home furnishings, with production plants in Austria, Poland, Brazil, and the USA. Their myriad replication challenges included continuous, bi-directional data replication between headquarters and subsidiaries, and uni-directional replication between the mainframe and the Oracle Data Warehouse.

The Problem:

  • Phase-in of new applications required parallel processing

The Solution:

  • Bidirectional, real-time synchronization of DB2 and DL/I databases across LPARs using tcVISION DBMS Extension

Results:

  • Applications can coexist indefinitely
  • Projects gain needed flexibility
  • Record types easily handled

Learn more about tcACCESS and tcVISION, or read additional case studies, here.

Are You a Data Hoarder?


by Joseph Brady, Marketing and Documentation Manager for Treehouse Software

I moved recently. I’ve heard it said that moving is one of life’s most stressful experiences…right up there with death. I would have to agree.

At some point during the process of evaluating every item I owned to determine whether to keep, toss, or donate it, it occurred to me that I could be the next star of the Hoarders reality show. It hit me especially hard when we unearthed the boxes tucked away in the storage space under the staircase, containing documents (and other essentials like my son’s artwork from kindergarten) that I hadn’t unpacked — let alone touched — since my last move over seven years ago. Over the years, I had accumulated a houseful of stuff — all of which at some point in time had been essential.

It took weeks to pack it all. Each item I was going to move to the new place had to be carefully wrapped, boxed, and labeled, minimizing breakage and losses and the amount of time the movers would have to spend on the other side. Not only did this potentially save us hundreds of dollars, but it also expedited the unpacking and ensured my stuff in the new digs would be well organized and clutter-free. Looking back, this was time well spent.

So what does this have to do with your data?

Just as stuff inside a home can be hoarded, so can data! One of the long-running complaints about corporate IT is how data gets “siloed”, which constrains organizations from acting across internal boundaries. Enterprises generate huge amounts of structured and unstructured data stored in various data repositories across the organization and in production systems. There are many reasons behind data hoarding, but if you have a data archiving strategy, you can ensure that inactive data — especially that which may reside inside legacy applications — is stored, managed, secured, or destroyed, and that the data you keep can be accessed for any reason at any time.

For the purposes of this article, let’s talk about inactive data that is rarely or lightly used (or may not be used at all, except on occasion when someone needs to look something up). While there may be myriad methods to access this data, including writing customized SQL queries, these may vary greatly in quality and shareability, and you may need training to support them. Maintaining a mixed bag of one-off SQL queries may prove to be a headache. What is needed is an agile, simple solution.

Introducing DataNovata.

DataNovata is a Rapid Application Development tool that instantly generates a secure, read-only, web-based application to query and navigate any database in any conceivable way. A simple, focused, and cost-effective tool, DataNovata is a perfect way to get a standard architecture in place as applications are being retired or decommissioned while the data still needs to be available for various purposes—regulatory/statutory, audit, historical analysis, etc.

The feature-rich, web-enabled applications generated by DataNovata are suitable for end users and give them powerful data interrogation facilities to facilitate finding the information they need quickly—with minimal training and technical expertise required. Organizations are realizing enormous cost savings by leveraging the power of DataNovata.
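The "navigate any database" idea rests on introspecting the schema's declared relationships. Here is a minimal sketch of that step using sqlite3's foreign-key metadata to derive navigation links, the kind of structure a generated read-only application could follow; the schema and function names are illustrative, not DataNovata internals:

```python
import sqlite3

# Illustrative schema: orders reference customers via a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        total REAL
    );
""")

def navigation_links(conn, table):
    """Derive (local column, target table, target column) links from foreign keys."""
    fks = conn.execute(f"PRAGMA foreign_key_list({table})").fetchall()
    # Each pragma row: (id, seq, parent_table, from_col, to_col, on_update, ...)
    return [(row[3], row[2], row[4]) for row in fks]

links = navigation_links(conn, "orders")
print(links)
```

A generator working from links like these can render every row with clickable paths to its related records, with no per-application coding.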

DataNovata is used for a variety of purposes such as:

  • Application Retirement: DataNovata can replace the enquiry facilities of an application should it need to be decommissioned or retired, e.g. superfluous to requirements, no longer supported, expensive to run, obsolete platform.
  • Archived Data Management: DataNovata can provide instant access to all your archived data, reducing the costs of supporting old and expensive platforms.
  • Enhance Access to Mainframe Data: Using the complementary product tcACCESS, DataNovata can be used to give access to data held in a variety of mainframe data sources including DB2, IMS/DB, CA-IDMS, CA-Datacom, VSAM and ADABAS.
  • Satisfy Legal Requirements: DataNovata can satisfy statutory legal requirements with regard to data retention and provide an automated deletion process for purging end-of-life data.
  • Platform and Application Rationalization: DataNovata can assist in streamlining your IT infrastructure.
  • Applications Portfolio Management: DataNovata provides a solution to the application redundancy and upgrade issues created by mergers and acquisitions, eliminating the associated licensing, maintenance and hardware costs.
  • Information Lifecycle Management: DataNovata can be central to your Information Lifecycle Management policy, responding to requests, retrieving data, providing access to authorized users and handling infrequently accessed information.
  • Forensic Analysis: DataNovata can become a powerful forensic tool for the detection of fraud, as the generated application follows the relationships within the data not necessarily used by the original application.
  • Analytical Databases: DataNovata provides the perfect user interface for analytical databases containing petabytes of data, even where there are no relationships defined in the database.
  • Application Renovation: DataNovata can renovate the user interface of an aging application with a modern, intuitive and feature-rich front-end, leaving any off-line processing unaffected.
  • Testing Facility: DataNovata can be an invaluable testing facility for application development or maintenance, where an independent but structured view of the data would be useful for verifying database operations.

Because of its universal use, DataNovata does not sit in any specific vertical market niche. However, its ability to provide users access to legacy data makes it well-suited for use by the financial, pensions, banking and insurance industries.

Now, if there were only something as easy as DataNovata that would help me unpack those last miscellaneous boxes, I could have my housewarming party.