Customer Success: tcVISION / Hadoop Integration


BAWAG P.S.K. is one of the largest and most profitable banks in Austria, with more than 1.6 million private and business customers and a well-known brand in the country. Its business strategy is oriented toward low risk and high efficiency. Its business segments are Retail Banking and Small Business, Corporate Lending and Investments, and Treasury Services and Markets. At the center of the BAWAG P.S.K. business strategy is the offering of easy-to-understand, transparent, first-rate products and services that meet the requirements of its customers.

BAWAG P.S.K. (Bank für Arbeit und Wirtschaft und Österreichische Postsparkasse Aktiengesellschaft), Vienna, runs its IT on the z/OS operating system. The corporate data is stored in DB2 databases; Oracle is the database platform for the open-systems environments. In connection with another project, BAWAG P.S.K. already had a client component of tcVISION installed.

Magister Markus Lechner, Head of IT Applications: “tcVISION was already in use, and we had good experiences as far as functionality and support are concerned. Because we were planning the implementation of another project, we included tcVISION in the list of candidate software solutions. The goal of this project was the reduction of the load on the IBM mainframe and, as a result, the reduction of costs. The intention was to offload data from our core database system to a less expensive system in real time and to provide read access to that data from the new infrastructure. The reason was constantly increasing CPU costs on the mainframe caused by the growing transaction load from online banking, mobile banking, and self-service devices. A large percentage of the load was caused by read-only transactions.”

Markus Lechner continues: “After the tcVISION presentation, we arranged a proof of concept. The important aspects of the POC were not only the functionality of tcVISION within the project; we also wanted to see whether our expectations regarding performance and CPU consumption on the mainframe would be met. In addition to tcVISION, we also evaluated another product. All of our expectations were met to our full satisfaction during the POC, and we made the decision to go ahead with tcVISION.”

After a short implementation period, the project has now been in production for one year. Markus Lechner describes the project: “The primary objective of the project with tcVISION was the reduction of CPU load on the mainframe to reduce our costs. Our intention was to offload data from our core database system to a less expensive system in real time and to provide read access to that data from the new infrastructure.


We use tcVISION for the real-time replication, and we use Apache Hadoop as a cost-efficient system for storing the data. Beyond the primary usage scenario, we also cover additional use cases, including real-time event handling and stream processing, analytics based upon real-time data, and the ability to report on and analyze structured and unstructured data with excellent performance. The system can be operated inexpensively on commodity hardware and has no scalability limitations. Compared to the savings, the replication costs (CPU consumption) of tcVISION are very low. The support provided was excellent during the implementation phase and also during the production phase. Inquiries by telephone or e-mail receive prompt responses. Problems that came up during this period were solved as quickly as possible, even when the tcVISION software had to be extended.” There are plans to extend the use of tcVISION in the future; one is to implement real-time replication from Oracle into the data lake.
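tcVISION's and Hadoop's internals are not shown in this article. Purely as an illustration of the data-lake landing pattern the bank describes (change events written to inexpensive, partitioned storage for downstream analytics), here is a minimal, hypothetical Python sketch; the event fields, directory layout, and table names are invented for the example, and a local directory stands in for HDFS.

```python
import json
import os
import tempfile
from datetime import datetime, timezone

def land_change_event(base_dir, event):
    """Append one change event as a JSON line into a date-partitioned
    directory (dt=YYYY-MM-DD), a layout commonly used in data lakes so
    analytics engines can scan only the partitions they need."""
    day = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    part_dir = os.path.join(base_dir, f"dt={day}")
    os.makedirs(part_dir, exist_ok=True)
    path = os.path.join(part_dir, "changes.jsonl")
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
    return path

# Example: land two captured changes (hypothetical table and fields).
lake = tempfile.mkdtemp()
land_change_event(lake, {"op": "INSERT", "table": "ACCOUNTS", "id": 1, "balance": 100})
path = land_change_event(lake, {"op": "UPDATE", "table": "ACCOUNTS", "id": 1, "balance": 90})
with open(path, encoding="utf-8") as f:
    print(sum(1 for _ in f))  # 2 events landed
```

In a real deployment the partitioned files would live in HDFS and be queried by engines such as Hive or Spark; the append-only, date-partitioned change log is what makes the replicated data cheap to store and fast to scan.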

Magister Markus Lechner concludes: “tcVISION enables us to significantly reduce our mainframe costs through real-time replication to a less expensive environment. tcVISION performs a very economical log-based replication. In addition, we are now in a position to implement numerous use cases based upon the replicated data that would have been too expensive and resource-intensive on the mainframe. Real-time event handling, real-time analytics, and real-time fraud prevention are only a few of the use cases that we currently cover.”


tcVISION


tcVISION enables bidirectional replication for DB2, Oracle, and SQL Server running on Linux/Unix/Windows. It synchronizes each data source by first performing a bulk load from source(s) to target and then replicating only changes, and only committed changes, from source(s) to target. There can therefore never be ambiguity as to whether a query against the target database involves uncommitted data.
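tcVISION's actual replication engine is proprietary; purely as an illustration of the committed-changes-only guarantee described above, here is a hypothetical Python sketch that buffers each transaction's changes and applies them to the target only when a COMMIT record appears in the change log. All names and record formats are invented for the example.

```python
def apply_committed_changes(log_records, target):
    """Replay a change log against a target (modeled as a dict), applying a
    transaction's changes only when its COMMIT record is seen and discarding
    them on ROLLBACK, so the target never contains uncommitted data."""
    pending = {}  # transaction id -> buffered change records
    for rec in log_records:
        txn, op = rec["txn"], rec["op"]
        if op in ("INSERT", "UPDATE", "DELETE"):
            pending.setdefault(txn, []).append(rec)
        elif op == "COMMIT":
            for change in pending.pop(txn, []):
                if change["op"] == "DELETE":
                    target.pop(change["key"], None)
                else:
                    target[change["key"]] = change["value"]
        elif op == "ROLLBACK":
            pending.pop(txn, None)  # drop the uncommitted work entirely
    return target

log = [
    {"txn": 1, "op": "INSERT", "key": "A", "value": 10},
    {"txn": 2, "op": "INSERT", "key": "B", "value": 20},
    {"txn": 1, "op": "COMMIT"},
    {"txn": 2, "op": "ROLLBACK"},   # B never reaches the target
]
print(apply_committed_changes(log, {}))  # {'A': 10}
```

Buffering per transaction is the design choice that matters here: a reader of the target database can never observe a half-finished transaction, which is exactly why queries against the replica involve no uncommitted data.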

Read other tcVISION customer success stories here.



Visit the Treehouse Software website for more information on tcVISION, or contact us to discuss your needs.

Treehouse Software will be Exhibiting at CA World in November 2014

If you are attending CA World in Las Vegas in November, be sure to stop by the Treehouse Software booth and say hello!

We’ll be featuring our comprehensive and flexible portfolio of solutions for integration, replication, and migration of data between mainframe sources and any target, application or platform using ETL, CDC, SQL, XML and SOA technologies.


CA World ’14
November 9–12, 2014
Mandalay Bay Resort & Casino
Las Vegas, Nevada


Visitors to our exhibits will learn how Treehouse Software is currently providing several large organizations with ETL and real-time, bi-directional data replication using tcVISION. tcVISION provides easy and fast bi-directional data replication between mainframe, Linux, Unix, and Windows platforms.



We will also showcase tcACCESS, which integrates mainframe data and applications with open systems and Windows.



Meanwhile, if there is a mainframe data replication project in your future, contact Treehouse Software today.

Treehouse Software will present “Replicate Data in Real-time: Anytime, Anywhere – Live tcVISION Demonstration” at the 2014 WAVV Conference

Chris Rudolph, Senior Technical Representative for Treehouse Software will be presenting, “Replicate Data in Real-time: Anytime, Anywhere – Live tcVISION Demonstration” on Tuesday, April 15th at 11:00 AM, as part of WAVV’s vendor presentation series at the upcoming 2014 WAVV Conference, to be held April 13 – 16 at The Embassy Suites in Covington, KY.


Chris will discuss how Treehouse Software is currently providing several large organizations with ETL and real-time, bi-directional data replication using tcVISION. tcVISION provides easy and fast bi-directional data replication between mainframe, Linux, Unix, and Windows platforms. Additionally, Chris will show a live demonstration of tcVISION in action.

About WAVV…


WAVV is a user group promoting the interests of users of the VSE, VM, and Linux operating systems. WAVV holds an annual conference consisting of over 100 educational sessions as well as an exhibitor show where vendors of VSE-, VM-, and Linux-related products show their wares and meet with customers.

More information on WAVV can be found on their website: http://www.wavv.org/

TREEHOUSE CUSTOMER UPDATE:


by Chris Rudolph, Senior Technical Representative for Treehouse Software and Joseph Brady, Marketing and Documentation Manager for Treehouse Software

This is a follow-up to our recent Treehouse Software Blog entry “Treehouse Software is Setting Sights on Many New Data Replication Projects”, in which we described a typical customer visit to implement data replication.

Treehouse representatives were on-site at a state government agency to configure tcVISION and set up bulk transfer and change data capture, as well as train the state employees on using and managing tcVISION. In a subsequent discussion with our contacts at the site, they reported that their deadline for delivering a reporting database in Microsoft SQL Server replicated from 63 ADABAS files had been met. They also happily noted that, using tcVISION, the bulk transfer of 60 million ADABAS records into SQL Server completed in only 20 minutes.
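As a quick sanity check on the reported figure, 60 million records in 20 minutes works out to a sustained rate of 50,000 records per second:

```python
# Throughput implied by the bulk transfer reported above.
records = 60_000_000
minutes = 20
per_second = records / (minutes * 60)
print(f"{per_second:,.0f} records/second")  # 50,000 records/second
```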

We are very pleased to have yet another satisfied customer benefitting from one of Treehouse Software’s mature and proven enterprise software solutions.

tcVISION provides real-time data replication through change data capture, and allows easy and fast data migration for mainframe application modernization projects. Enterprises looking for a product that enables bi-directional heterogeneous data replication between mainframe, Linux, Unix, and Windows platforms need look no further than tcVISION from Treehouse Software.

To learn more about tcVISION, or to request a demonstration, contact Treehouse Software today.

Cloud-y … with a 100 Percent Chance of Data


by Wayne Lashley, Chief Business Development Officer for Treehouse Software

Along with three of my colleagues, I recently participated in the Treehouse exhibit at the Gartner Application Architecture, Development and Integration (AADI) event in Las Vegas. This is a conference where we have exhibited in the past, and I personally have attended several other times. In fact, I just learned that the “DI” in AADI no longer stands for “Data Integration”; this change was made only in the past couple of years, and in the past it was the data integration aspect that made the show particularly relevant to Treehouse.

Though data integration vendors such as Informatica, Pervasive and Adeptia—and Treehouse—were in attendance, their numbers seemed diminished over prior years. And while “Legacy Modernization” had an entire subject “track” a couple of years ago, a number of “name” LM vendors were notably absent this year, and the topic was only rarely represented in sessions.

But there was a predominant theme at the event, and its name is Cloud.

People have been talking about “Cloud” for years already, and it is a well-established concept with many dimensions and extensive implementations. And it’s probably familiar enough to The Branches readers that I won’t waste words describing it, other than to say that it is simply a way to offer computing services via the Internet without the subscriber—most Cloud offerings are subscription-based—knowing or caring what or where the physical implementation is.

Many people consider that Salesforce.com is the granddaddy of all Cloud services, and to my mind it popularized the term “Software as a Service” (SaaS). Evolutionary Technologies, Inc. (ETI), a long-standing player in the data integration field and a company that I have had a lot of contact with over the years, reinvented itself around 2005 as a SaaS company, in doing so placing the company on the leading edge of “aaS” providers and essentially defining an entirely new market space.

These days there are a number of other “aaS” genres competing for mindshare and dollars, the most dominant being “Platform as a Service” (PaaS). Once again, Salesforce.com seemed to define the space initially, but others such as Amazon and Google have since come to dominate it. Just this week I was invited to an event for Oracle partners where Oracle executives will present their concept for an Oracle Cloud PaaS. I recall a Microsoft Worldwide Partners Conference (WWPC) a couple of years ago where Microsoft kicked off its Azure Cloud platform. You don’t have to install Microsoft Office on your PC anymore; Office 365 runs in the Cloud.

Even legacy applications are getting the Cloud treatment: a company called Heirloom Computing has commenced offering a platform for running legacy COBOL applications in the Cloud.

Cloud has also entered popular culture and commodity services. There’s a TV commercial that I keep seeing advertising a Cloud-based service that automatically troubleshoots, tunes up and cleans up your PC.

In short, you’re nobody if you’re not in the Cloud.

D-for-“Data” may have morphed into D-for-“Development” in the AADI Summit name, but data replication, integration and migration remain very relevant in the Cloud age. Indeed, you can’t spell Cloud without a D.

To support provisioning of Cloud-based applications, there has to be a means for getting data from where it is now—often in mainframe-based legacy databases or relational databases on open systems, within a company’s internal IT infrastructure—to the Cloud facilities, be they public or private. This doesn’t happen by magic. We have recently been working in a customer implementation where Oracle and DB2 data are being replicated bidirectionally in a Cloud implementation using our tcVISION solution. Such a scenario posed a bit of a challenge for us in terms of licensing: the machines on which tcVISION is installed are not specifically known at a given point in time. So we had to adapt our licensing model to accommodate the new reality.

We expect to see continued growth and demand for our replication and integration solutions as Cloud offerings evolve and expand. Furthermore, we are working on a new Cloud-oriented solution in collaboration with Cloud platform providers. I have briefed several Gartner analysts on it, and their feedback has been encouraging. Check back to this space regularly for news on this exciting new Treehouse offering.

Are You a Data Hoarder?


by Joseph Brady, Marketing and Documentation Manager for Treehouse Software

I moved recently. I’ve heard it said that moving is one of life’s most stressful experiences…right up there with death. I would have to agree.

At some point during the process of evaluating every item I owned to determine whether to keep, toss or donate, it occurred to me that I could be the next star of The Hoarders reality show. It hit me especially hard when we unearthed the boxes that were tucked away under the staircase storage space that contained documents (or other essentials like my son’s artwork from kindergarten) that I hadn’t unpacked — let alone touched — since my last move over seven years ago. Over the years, I had accumulated a houseful of stuff — all of which at some point in time had been essential.

It took weeks to accomplish packing it all. Each item that I was going to move to the new place had to be carefully wrapped, boxed, and labeled, minimizing any breakage or losses and the amount of time the movers would have to spend on the other side. Not only would this potentially save us hundreds of dollars, but would expedite the unpacking and ensure my stuff in the new digs would be well organized and clutter-free. Looking back, this was time well spent.

So what does this have to do with your data?

Just as stuff inside a home can be hoarded, so can data! One of the long-running complaints about corporate IT is how data gets “siloed,” which constrains organizations from acting across internal boundaries. Enterprises generate huge amounts of structured and unstructured data stored in various repositories across the organization and in production systems. There are many reasons behind data hoarding, but with a data archiving strategy you can ensure that inactive data, especially data that may reside inside legacy system applications, is stored, managed, secured or destroyed, and that the data you keep can be accessed for any reason at any time.

For the purposes of this article, let’s talk about inactive data that is rarely or lightly used (or may not be used at all, except on occasion when someone needs to look something up). While there may be myriad ways to access this data, including writing customized SQL queries, these can vary greatly in quality and shareability, and you may need training to support them. Maintaining a mixed bag of individually crafted SQL queries can prove to be a headache. What is needed is an agile, simple solution.

Introducing DataNovata.

DataNovata is a Rapid Application Development tool that instantly generates a secure, read-only, web-based application to query and navigate any database in any conceivable way. A simple, focused, and cost-effective tool, DataNovata is a perfect way to put a standard architecture in place as applications are retired or decommissioned while their data still needs to be available for various purposes: regulatory/statutory, audit, historical analysis, etc.

The feature-rich, web-enabled applications generated by DataNovata are suitable for end users and give them powerful data interrogation facilities to facilitate finding the information they need quickly—with minimal training and technical expertise required. Organizations are realizing enormous cost savings by leveraging the power of DataNovata.

DataNovata is used for a variety of purposes such as:

  • Application Retirement: DataNovata can replace the enquiry facilities of an application should it need to be decommissioned or retired, e.g. superfluous to requirements, no longer supported, expensive to run, obsolete platform.
  • Archived Data Management: DataNovata can provide instant access to all your archived data, reducing the costs of supporting old and expensive platforms.
  • Enhance Access to Mainframe Data: Using the complementary product tcACCESS, DataNovata can be used to give access to data held in a variety of mainframe data sources including DB2, IMS/DB, CA-IDMS, CA-Datacom, VSAM and ADABAS.
  • Satisfy Legal Requirements: DataNovata can satisfy statutory legal requirements with regard to data retention and provide an automated deletion process for purging end-of-life data.
  • Platform and Application Rationalization: DataNovata can assist in streamlining your IT infrastructure.
  • Applications Portfolio Management: DataNovata provides a solution to the application redundancy and upgrade issues created by mergers and acquisitions, eliminating the associated licensing, maintenance and hardware costs.
  • Information Lifecycle Management: DataNovata can be central to your Information Lifecycle Management policy, responding to requests, retrieving data, providing access to authorized users and handling infrequently accessed information.
  • Forensic Analysis: DataNovata can become a powerful forensic tool for the detection of fraud, as the generated application follows the relationships within the data not necessarily used by the original application.
  • Analytical Databases: DataNovata provides the perfect user interface for analytical databases containing petabytes of data, even where there are no relationships defined in the database.
  • Application Renovation: DataNovata can renovate the user interface of an aging application with a modern, intuitive and feature-rich front-end, leaving any off-line processing unaffected.
  • Testing Facility: DataNovata can be an invaluable testing facility for application development or maintenance, where an independent but structured view of the data would be useful for verifying database operations.

Because of its universal use, DataNovata does not sit in any specific vertical market niche. However, its ability to provide users access to legacy data makes it well-suited for use by the financial, pensions, banking and insurance industries.

Now, if only there were something as easy as DataNovata to help me unpack those last miscellaneous boxes, I could have my housewarming party.