Treehouse Customer Update

by Chris Rudolph, Senior Technical Representative for Treehouse Software and Joseph Brady, Marketing and Documentation Manager for Treehouse Software

This is a follow-up to our recent Treehouse Software Blog entry “Treehouse Software is Setting Sights on Many New Data Replication Projects”, in which we described a typical customer visit to implement data replication.

Treehouse representatives were on-site at a state government agency to configure tcVISION, set up bulk transfer and change data capture, and train the State’s employees in using and managing tcVISION. In a subsequent discussion, our contacts at the site reported that they had met their deadline for delivering a Microsoft SQL Server reporting database replicated from 63 ADABAS files. They also happily noted that with tcVISION, the bulk transfer of 60 million ADABAS records into SQL Server completed in only 20 minutes, an average throughput of 50,000 records per second.

We are very pleased to have yet another satisfied customer benefitting from one of Treehouse Software’s mature and proven enterprise software solutions.

tcVISION provides real-time data replication through change data capture and enables fast, easy data migration for mainframe application modernization projects. Enterprises looking for a product that enables bi-directional, heterogeneous data replication among mainframe, Linux, Unix, and Windows platforms need look no further than tcVISION from Treehouse Software.
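
For readers new to the technique, change data capture works by reading changes from the source database’s log as they occur and applying only those changes to the target, rather than repeatedly re-copying entire files. The short Python sketch below illustrates the general shape of the apply loop; the change-log format and table names are invented for illustration, and this is not tcVISION’s actual implementation:

```python
import sqlite3

# Conceptual illustration of a change-data-capture apply loop.
# A real replicator reads the DBMS's own log; here we simulate
# captured changes with a simple in-memory list.
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# Simulated captured changes: (operation, key, value)
change_log = [
    ("INSERT", 1, "Acme Corp"),
    ("INSERT", 2, "Globex"),
    ("UPDATE", 2, "Globex LLC"),
    ("DELETE", 1, None),
]

for op, key, value in change_log:
    if op == "INSERT":
        target.execute("INSERT INTO customers VALUES (?, ?)", (key, value))
    elif op == "UPDATE":
        target.execute("UPDATE customers SET name = ? WHERE id = ?", (value, key))
    elif op == "DELETE":
        target.execute("DELETE FROM customers WHERE id = ?", (key,))
target.commit()

print(target.execute("SELECT * FROM customers").fetchall())  # [(2, 'Globex LLC')]
```

The value of a mature product lies in doing this reliably and at scale: reading the source log directly, preserving transaction boundaries, and translating data types and encodings between platforms.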

To learn more about tcVISION, or to request a demonstration, contact Treehouse Software today.

Cloud-y … with a 100 Percent Chance of Data

by Wayne Lashley, Chief Business Development Officer for Treehouse Software

Along with three of my colleagues, I recently participated in the Treehouse exhibit at the Gartner Application Architecture, Development and Integration (AADI) event in Las Vegas. This is a conference where we have exhibited in the past, and one I personally have attended several other times. In fact, I just learned that the “DI” in AADI no longer stands for “Data Integration”; the change was made only in the past couple of years. It was the data integration aspect that originally made the show particularly relevant to Treehouse.

Though data integration vendors such as Informatica, Pervasive and Adeptia, along with Treehouse, were in attendance, their numbers seemed diminished from prior years. And while “Legacy Modernization” had an entire subject “track” a couple of years ago, a number of “name” LM vendors were notably absent this year, and the topic was only rarely represented in sessions.

But there was a predominant theme at the event, and its name is Cloud.

People have been talking about “Cloud” for years already, and it is a well-established concept with many dimensions and extensive implementations. It is probably familiar enough to readers of The Branches that I won’t waste words describing it, other than to say that it is simply a way to offer computing services via the Internet without the subscriber (most Cloud offerings are subscription-based) knowing or caring what or where the physical implementation is.

Many people consider Salesforce.com the granddaddy of all Cloud services, and to my mind it popularized the term “Software as a Service” (SaaS). Evolutionary Technologies, Inc. (ETI), a long-standing player in the data integration field and a company I have had a lot of contact with over the years, reinvented itself around 2005 as a SaaS company, placing itself on the leading edge of “aaS” providers and essentially defining an entirely new market space.

These days there are a number of other “aaS” genres competing for mindshare and dollars, the most dominant being “Platform as a Service” (PaaS). Once again, Salesforce.com seemed to define the space initially, but others such as Amazon and Google have since come to dominate it. Just this week I was invited to an event for Oracle partners where Oracle executives will present their concept for an Oracle Cloud PaaS. I recall a Microsoft Worldwide Partners Conference (WWPC) a couple of years ago where Microsoft kicked off its Azure Cloud platform. You don’t have to install Microsoft Office on your PC anymore; Office 365 runs in the Cloud.

Even legacy applications are getting the Cloud treatment: a company called Heirloom Computing has commenced offering a platform for running legacy COBOL applications in the Cloud.

Cloud has also entered popular culture and commodity services. There’s a TV commercial I keep seeing that advertises a Cloud-based service which automatically troubleshoots, tunes up and cleans up your PC.
In short, you’re nobody if you’re not in the Cloud.

D-for-“Data” may have morphed into D-for-“Development” in the AADI Summit name, but data replication, integration and migration remain very relevant in the Cloud age. Indeed, you can’t spell Cloud without a D.

To support provisioning of Cloud-based applications, there has to be a means of getting data from where it is now (often in mainframe legacy databases, or in relational databases on open systems within a company’s internal IT infrastructure) to the Cloud facilities, be they public or private. This doesn’t happen by magic. We have recently been working on a customer implementation in which Oracle and DB2 data are replicated bidirectionally in a Cloud environment using our tcVISION solution. Such a scenario posed a bit of a licensing challenge for us: the machines on which tcVISION is installed are not specifically known at any given point in time, so we had to adapt our licensing model to accommodate the new reality.
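
As an aside for the technically curious, one classic wrinkle in bidirectional replication is making sure that a change applied on one side is not captured again and “echoed” back to where it originated. The Python sketch below illustrates the general origin-tagging idea; it is a conceptual illustration with invented names, not a description of how tcVISION handles it:

```python
from dataclasses import dataclass

# Generic illustration of echo-prevention in bidirectional replication:
# each captured change carries its origin, and the apply step skips
# changes that originated on the side it is applying to.

@dataclass
class Change:
    origin: str   # "oracle" or "db2" in this example
    table: str
    op: str       # "INSERT" / "UPDATE" / "DELETE"
    row: dict

def apply_changes(changes: list[Change], target: str) -> list[Change]:
    """Apply only the changes that did not originate on the target side."""
    applied = []
    for c in changes:
        if c.origin == target:
            continue  # skip the echo: this change came from `target` itself
        # ...execute the corresponding SQL against `target` here...
        applied.append(c)
    return applied

stream = [
    Change("oracle", "orders", "INSERT", {"id": 1}),
    Change("db2", "orders", "UPDATE", {"id": 1, "status": "shipped"}),
]
print(len(apply_changes(stream, target="db2")))  # 1: the db2-origin change is skipped
```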

We expect to see continued growth and demand for our replication and integration solutions as Cloud offerings evolve and expand. Furthermore, we are working on a new Cloud-oriented solution in collaboration with Cloud platform providers. I have briefed several Gartner analysts on it, and their feedback has been encouraging. Check back to this space regularly for news on this exciting new Treehouse offering.

Are You a Data Hoarder?

by Joseph Brady, Marketing and Documentation Manager for Treehouse Software

I moved recently. I’ve heard it said that moving is one of life’s most stressful experiences…right up there with death. I would have to agree.

At some point during the process of evaluating every item I owned to decide whether to keep, toss or donate it, it occurred to me that I could be the next star of the reality show Hoarders. It hit me especially hard when we unearthed the boxes tucked away in the storage space under the staircase, full of documents (and other essentials like my son’s artwork from kindergarten) that I hadn’t unpacked, let alone touched, since my last move over seven years ago. Over the years, I had accumulated a houseful of stuff, all of which at some point in time had been essential.

It took weeks to pack it all. Each item that was making the move to the new place had to be carefully wrapped, boxed, and labeled, minimizing breakage and losses and the amount of time the movers would have to spend on the other side. Not only would this potentially save us hundreds of dollars, it would also expedite the unpacking and ensure that my stuff in the new digs would be well organized and clutter-free. Looking back, it was time well spent.

So what does this have to do with your data?

Just as stuff inside a home can be hoarded, so can data! One of the long-running complaints about corporate IT is how data gets “siloed”, which constrains an organization’s ability to act across internal boundaries. Enterprises generate huge amounts of structured and unstructured data, stored in various repositories across the organization and in production systems. There are many reasons behind data hoarding, but with a data archiving strategy you can ensure that inactive data (especially data locked away inside legacy applications) is appropriately stored, managed, secured or destroyed, and that the data you keep can be accessed for any reason at any time.

For the purposes of this article, let’s talk about inactive data that is rarely or lightly used, or perhaps not used at all except when someone occasionally needs to look something up. While there are myriad ways to access such data, including writing custom SQL queries, these can vary greatly in quality and shareability, and supporting them may require training. Maintaining a grab bag of hand-written, one-off SQL queries can quickly become a headache. What is needed is an agile, simple solution.
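
To make the problem concrete, here is the kind of one-off lookup script that tends to accumulate around archived data. The sketch uses Python’s built-in sqlite3 module, and the policy/claims schema is invented purely for illustration:

```python
import sqlite3

# One of many one-off lookup scripts that accumulate around archived
# data. The schema below is invented for the example; a real script
# would point at the actual archive database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policies (id INTEGER PRIMARY KEY,
                           policy_number TEXT, holder_name TEXT);
    CREATE TABLE claims   (policy_id INTEGER, claim_date TEXT, amount REAL);
    INSERT INTO policies VALUES (1, 'POL-004217', 'J. Smith');
    INSERT INTO claims   VALUES (1, '2011-03-09', 1250.00);
""")

def find_policy(policy_number):
    """Hand-written lookup against the archive."""
    return conn.execute(
        """
        SELECT p.policy_number, p.holder_name, c.claim_date, c.amount
        FROM policies p LEFT JOIN claims c ON c.policy_id = p.id
        WHERE p.policy_number = ?
        """,
        (policy_number,),
    ).fetchall()

for row in find_policy("POL-004217"):
    print(row)  # ('POL-004217', 'J. Smith', '2011-03-09', 1250.0)
```

Multiply this by every ad-hoc question users ask, each script embedding join logic and column names that only its author fully understands, and the maintenance burden becomes clear.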

Introducing DataNovata.

DataNovata is a Rapid Application Development tool that instantly generates a secure, read-only, web-based application to query and navigate any database in any conceivable way. A simple, focused and cost-effective tool, DataNovata is a perfect way to get a standard architecture in place as applications are retired or decommissioned while their data still needs to be available for various purposes: regulatory/statutory, audit, historical analysis, and so on.

The feature-rich, web-enabled applications generated by DataNovata are suitable for end users and give them powerful data interrogation facilities to facilitate finding the information they need quickly—with minimal training and technical expertise required. Organizations are realizing enormous cost savings by leveraging the power of DataNovata.
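
To give a feel for the concept, here is a hand-rolled miniature, in Python with Flask, of the kind of read-only query application DataNovata generates automatically; the database, table and columns are invented for the example, and this is not DataNovata’s actual output:

```python
import sqlite3
from flask import Flask, jsonify, request

# A hand-rolled miniature of a read-only, web-based query application.
# The demo schema is invented; a generator derives this from the real one.
DB = "archive_demo.db"  # hypothetical archive database

def setup():
    # Seed a tiny demo archive so the sketch is self-contained.
    conn = sqlite3.connect(DB)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS policies
            (policy_number TEXT PRIMARY KEY, holder_name TEXT);
        INSERT OR IGNORE INTO policies VALUES
            ('POL-004217', 'J. Smith'), ('POL-009301', 'A. Jones');
    """)
    conn.commit()
    conn.close()

app = Flask(__name__)

@app.get("/policies")
def list_policies():
    # Optional ?holder= substring filter; parameterized and strictly read-only.
    holder = request.args.get("holder", "")
    conn = sqlite3.connect(DB)
    conn.row_factory = sqlite3.Row
    rows = conn.execute(
        "SELECT policy_number, holder_name FROM policies "
        "WHERE holder_name LIKE ? LIMIT 100",
        (f"%{holder}%",),
    ).fetchall()
    conn.close()
    return jsonify([dict(r) for r in rows])

if __name__ == "__main__":
    setup()
    app.run(port=8080)  # then browse to /policies?holder=Smith
```

The point of a generator is that screens like this, including navigation along the foreign-key relationships in the schema, are produced automatically rather than hand-coded table by table.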

DataNovata is used for a variety of purposes such as:

  • Application Retirement: DataNovata can replace the enquiry facilities of an application that needs to be decommissioned or retired, e.g. because it is superfluous to requirements, no longer supported, expensive to run, or on an obsolete platform.
  • Archived Data Management: DataNovata can provide instant access to all your archived data, reducing the costs of supporting old and expensive platforms.
  • Enhance Access to Mainframe Data: Using the complementary product tcACCESS, DataNovata can be used to give access to data held in a variety of mainframe data sources including DB2, IMS/DB, CA-IDMS, CA-Datacom, VSAM and ADABAS.
  • Satisfy Legal Requirements: DataNovata can satisfy statutory legal requirements with regard to data retention and provide an automated deletion process for purging end-of-life data.
  • Platform and Application Rationalization: DataNovata can assist in streamlining your IT infrastructure.
  • Applications Portfolio Management: DataNovata provides a solution to the application redundancy and upgrade issues created by mergers and acquisitions, eliminating the associated licensing, maintenance and hardware costs.
  • Information Lifecycle Management: DataNovata can be central to your Information Lifecycle Management policy, responding to requests, retrieving data, providing access to authorized users and handling infrequently accessed information.
  • Forensic Analysis: DataNovata can become a powerful forensic tool for the detection of fraud, as the generated application can follow relationships within the data that were not necessarily used by the original application.
  • Analytical Databases: DataNovata provides the perfect user interface for analytical databases containing petabytes of data, even where there are no relationships defined in the database.
  • Application Renovation: DataNovata can renovate the user interface of an aging application with a modern, intuitive and feature-rich front-end, leaving any off-line processing unaffected.
  • Testing Facility: DataNovata can be an invaluable testing facility for application development or maintenance, where an independent but structured view of the data would be useful for verifying database operations.

Because it is so broadly applicable, DataNovata does not sit in any specific vertical market niche. However, its ability to give users access to legacy data makes it particularly well-suited to the financial, pensions, banking and insurance industries.

Now, if only there were something as easy as DataNovata to help me unpack those last miscellaneous boxes, I could have my housewarming party.