by Joseph Brady, Manager of Marketing and Technical Documentation at Treehouse Software, Inc.
tRelational, the data analysis, modeling, and mapping component of Treehouse Software’s Adabas-to-RDBMS product set, provides three options for developing RDBMS data models and mapping Adabas fields to RDBMS columns:
Option 1: Auto-generation
Option 2: Importation of existing RDBMS schema elements
Option 3: Detailed definition and manipulation of schema and mapping elements using tRelational
The Auto-generation function can be an extremely useful productivity aid. By simply invoking this function and specifying an Adabas file structure, a fully functional corresponding RDBMS schema (Tables, Columns, Primary Keys, Foreign Key relationships and constraints) and appropriate mappings are created virtually instantaneously. The table and column names, datatypes, lengths, and mappings/transformations are all automatically tailored to the specific RDBMS product, platform, and version; the user need not be an expert in the RDBMS.
tRelational’s schema auto-generation simply requires specification of an Adabas file structure and associated options…
The auto-generated model can be immediately used to generate both RDBMS DDL and parameters for the Data Propagation System (DPS) component. Within minutes of identifying the desired Adabas file or files to tRelational, the physical RDBMS schema can be implemented on the target platform and DPS can begin materializing and propagating data to load into the tables.
It is important to note that these modeling options complement each other and can be used in combination to meet any requirements. Auto-generated schema elements can be completely customized “after the fact”, as can imported elements. Auto-generation can be used at the file level to generate complete tables, or at the field level to generate table content, making it easy to then manually define and map one or more columns within a table, or even to denormalize MU/PE structures into a set of discrete columns.
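To illustrate the kind of schema auto-generation produces, here is a sketch of DDL for a hypothetical Adabas personnel file. All table, column, and constraint names are invented for this example, and the actual generated DDL is tailored to the target RDBMS, but the typical pattern is that a PE (periodic group) is normalized into a child table keyed by the parent record’s ISN plus an occurrence number:

```sql
-- Hypothetical auto-generated DDL for an Adabas personnel file.
CREATE TABLE EMPLOYEES (
    ISN          INTEGER      NOT NULL,   -- Adabas internal sequence number
    PERSONNEL_ID CHAR(8)      NOT NULL,
    FIRST_NAME   VARCHAR(20),
    LAST_NAME    VARCHAR(20),
    CONSTRAINT PK_EMPLOYEES PRIMARY KEY (ISN)
);

-- A PE group such as INCOME becomes a child table, one row per occurrence.
CREATE TABLE EMPLOYEES_INCOME (
    ISN        INTEGER NOT NULL,
    OCCURRENCE INTEGER NOT NULL,
    CURR_CODE  CHAR(3),
    SALARY     DECIMAL(9,2),
    CONSTRAINT PK_EMPLOYEES_INCOME PRIMARY KEY (ISN, OCCURRENCE),
    CONSTRAINT FK_EMPLOYEES_INCOME FOREIGN KEY (ISN)
        REFERENCES EMPLOYEES (ISN)
);
```

Alternatively, as noted above, an MU or PE structure can be denormalized into a set of discrete columns in the parent table (e.g., SALARY_1, SALARY_2, …) rather than a child table, depending on how the data will be queried.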
About Treehouse Software’s tRelational / DPS Product Set
tRelational / DPS is a robust product set that provides modeling and data transfer of legacy Adabas data into modern RDBMS-based platforms for Internet/Intranet/Business Intelligence applications. Treehouse Software designed these products to meet the demands of large, complex environments requiring product maturity, productivity, feature-richness, efficiency and high performance.
The tRelational component provides complete analysis, modeling and mapping of Adabas files and data elements to the target RDBMS tables and columns. DPS (Data Propagation System) performs Extract, Transform, and Load (ETL) functions for the initial bulk RDBMS load and incremental Change Data Capture (CDC) batch processing to synchronize Adabas updates with the target RDBMS.
by Wayne Lashley, Chief Business Development Officer for Treehouse Software
Recently, four of us from Treehouse Software—Mitch Doricich, Mike Kuechenberg, Chris Rudolph and myself—attended Software AG’s TechEd 2015, which took place on April 21-22, 2015 at the Chicago O’Hare Hyatt Regency Hotel. We were joined by over 180 other attendees, including about 60 representing Adabas/Natural sites.
The event was organized into separate tracks for Adabas/Natural, webMethods, ARIS and Alfabet, but the morning general sessions were for all attendees.
Tim Fortier of Software AG acted as the emcee for the two-day event. There were general presentations and keynotes by various Software AG directors and executives, including Mike Schumpert, Gerd Schneider, Mighael Botha, Cynthia Stegall, Kurt Hansen and Ricardo Leitão, on topics like the Internet of Things (IoT), Software AG’s TECHCommunities, innovation, training and support.
Besides the general sessions, the Treehouse team attended the Adabas/Natural breakout sessions. These included presentations by a number of familiar figures from Software AG, including Joe Gentry, Karlheinz Kronauer, Patrick Gould, Eric Wood, Becky Albin and Bruce Beaman, covering Adabas and Natural (and add-on products) roadmaps and futures, and new products and capabilities in the analytics, mobile and integration spaces.
Customer presentations included Geoff Wells and Don Ellis from the State of Minnesota, who described tremendous CPU savings using the zIIP Enabler for Natural and detailed their early experiences with the new Event Analytics for Adabas; Amarish Pathak from AAFMAA, on integrating Adabas/Natural/EntireX with mobile applications; and Manny Klonaris from Verizon, on Adabas Vista.
I found the Event Analytics for Adabas solution to be particularly compelling in today’s world of stringent security, audit and privacy requirements. Becky remarked that it could be called “Event Analytics for Anything”, because the data/events to be analyzed could involve virtually any source. Treehouse TRIM customers could be taking advantage of this solution today for their Adabas data, and analytics based on different data sources could be enabled through other Treehouse products and services.
A highlight of the event was Becky’s Master’s Class on Adabas v8.3. Mike and Chris, the Treehouse technical guys, were logged in to the Treehouse mainframe during the class so that they could follow along on our own system while Becky went through her material. The numerous other DBAs in the room also got great insight into the features and capabilities of this latest Adabas release.
In addition, I had the privilege of giving a presentation on behalf of Treehouse, speaking about how Treehouse can help maximize and revitalize Adabas/Natural through our products and services. I want to say a big thanks to Thadd Jenkins from Southwest Airlines, who provided a very gratifying testimonial about the long and mutually-beneficial relationship between Treehouse and Southwest.
My presentation is available for download here.
TechEd 2015 gave attendees a valuable opportunity to meet and mingle with Software AG experts, management, and other customers. For readers who were unable to make it this year, another TechEd is likely to be scheduled for 2016, based on the roundtable discussion and customer feedback. I encourage all customers to sign up for this free conference.
Our thanks to Software AG for organizing a very successful TechEd 2015, and for enabling Treehouse to participate in it.
Did you know that Treehouse Software offers online demonstrations of the most complete and flexible portfolio of solutions available anywhere for real-time, bidirectional data replication and integration between mainframe and LUW data sources?
You can see how Treehouse Software’s popular tcACCESS and tcVISION products efficiently and cost-effectively use ETL, CDC, SQL, XML, and SOA technologies for data replication / integration, in an interactive demonstration with our skilled technical experts.
Integrate mainframe data and applications with LUW data sources…
tcACCESS is a comprehensive software solution that enables two-way integration between IBM mainframe systems and client/server, Web and SOA technologies — without the need for mainframe knowledge or programming effort. A proven platform that facilitates SQL-based integration of mainframe data sources and programs into LUW applications, tcACCESS uses industry standards such as SQL, ODBC, JDBC, and .NET. SQL queries to access mainframe data can be easily created using drag and drop techniques — no programming required.
tcACCESS is a modular software solution. It consists of a base system that can either be implemented as a CICS transaction or as a VTAM application. The base system provides its own communication modules. The heart of the system is the tcACCESS SQL Engine which allows access to mainframe data sources using SQL statements. tcACCESS offers Listener components on the mainframe and on the client, as well as scheduling and security functions. Batch processors automate the information exchange processes between distributed applications.
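As an illustration of the SQL Engine in action, once mainframe data sources are exposed as relational tables, any ODBC, JDBC, or .NET client could issue an ordinary SQL statement against them. The table and column names below are hypothetical, chosen only to show the idea of joining data from different mainframe sources in one query:

```sql
-- Hypothetical query issued from an ODBC/JDBC client against
-- mainframe data exposed by the tcACCESS SQL Engine.
SELECT o.ORDER_ID,
       o.ORDER_DATE,
       c.CUSTOMER_NAME
FROM   VSAM_ORDERS   o                        -- e.g., a VSAM file
JOIN   DB2_CUSTOMERS c                        -- e.g., a Db2 table
       ON c.CUSTOMER_ID = o.CUSTOMER_ID
WHERE  o.ORDER_DATE >= '2015-01-01'
ORDER  BY o.ORDER_DATE;
```

The point is that the client application sees only standard SQL; the SQL Engine handles the translation to the underlying mainframe access methods.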
Enable ETL and bi-directional data replication between mainframe and LUW platforms…
tcVISION allows the exchange of data between heterogeneous databases, from legacy non-relational mainframe sources to standard RDBMSs, in batch or real time via CDC (change data capture). With tcVISION, complex replication scenarios can be implemented with ease, including bi-directional “master/master” replication requirements.
tcVISION considerably simplifies mainframe data exchange processes. The structure of the existing mainframe data is analyzed by tcVISION processors, then automatically and natively mapped to the target. The data mapping information is presented in a user-friendly and transparent format – even for users with no mainframe knowledge.
See for yourself, right at your desk…
Tell us about your challenges. If you have a project where our mainframe data replication and integration products could be of assistance, our skilled sales and technical staff would be happy to set up a free, online demo. Simply fill out our short Treehouse Software Demo Request Form.
Treehouse Software representatives are preparing for the upcoming WAVV Conference, which will be held April 13 – 16 at The Embassy Suites in Covington, KY. If you are attending the conference, stop by the Treehouse Software booth and say hello!
We will be exhibiting our data integration and replication products:
- tcVISION for enabling ETL and bi-directional data replication between mainframe, Linux, Unix and Windows platforms.
- tcACCESS for integrating mainframe data and applications with open systems and Windows.
Additionally, Chris Rudolph, Senior Technical Representative for Treehouse Software, will be presenting “Replicate Data in Real-time — Anytime, Anywhere (Live tcVISION Demonstration)” on Tuesday, April 15 at 11:00 AM as part of WAVV’s vendor presentation series.
WAVV is a user group promoting the interests of users of the VSE, VM, and Linux operating systems. WAVV holds an annual conference consisting of over 100 educational sessions as well as an exhibitor show where vendors of VSE-, VM-, and Linux-related products show their wares and meet with customers.
More information on WAVV can be found on their website.
Guest blogger Howard Sherrington, CEO of NSC LegacyData Solutions Ltd., developers of DataNovata, discusses how retiring legacy applications can cut IT costs and free up resources – if the right approach is taken to accessing the legacy data.
The one constant in business IT is that yesterday’s new systems will become tomorrow’s legacy. As organizations evolve and new technologies emerge, IT departments have to deal with the impact of this constant evolution and change on business operations.
But change events are difficult to predict. Mergers or acquisitions can transform even recent deployments into duplicate or legacy systems, as the IT function struggles to keep up with the changing demands of the business.
Legacy systems represent a drain on IT resources, both in terms of cost and manpower. Industry analyst Gartner conservatively estimates that businesses spend around 10% to 25% of their IT budget on supporting and managing legacy systems, and I believe this can rise to as much as 35% in some organizations. This is especially true for organizations running old and complex systems and applications.
So, given the current business climate, where capital for new projects is harder to come by and operational expenditure for existing systems is under close scrutiny, it’s prudent to look for more efficient ways of dealing with these older applications than simply keeping them running.
It’s all about the data
In these circumstances, it’s essential to establish why the legacy system is being maintained. In the vast majority of cases, it’s because of the data held in the system or application.
I frequently hear the statement, “No, we can’t do anything with that service, as we still need it”, even though nine times out of 10 it’s simply access to the data that’s required, rather than the application itself. This correlates with analyst firm Forrester’s estimate that nearly 85% of data in databases is inactive, simply stored for subsequent access rather than being processed.
This is often the case for financial systems, especially in pensions and investment management, which usually lie dormant (at considerable cost, as we saw earlier) so that the business retains access to the legacy data for legal, taxation, due diligence or compliance purposes.
A retirement bonus
So why not retain the vital elements and let go of the redundant parts? By separating the legacy data that the business needs from the legacy system, and then decommissioning the applications and platforms, a business could make substantial savings in budgets, support and resource commitment.
There’s also the opportunity to increase the efficiency of operations by giving staff wider, more flexible access to the legacy data. Let’s take a closer look at how application retirement should be approached and managed, and the benefits it can offer your business.
The initial issue in application retirement is the migration of the data from the legacy application or platform. Exactly how this is done will depend on several factors, including the type and age of the application and platform, and how the data is stored. However, there are a couple of key ‘best practice’ points which should be observed in any migration project.
First, reduce the risk of data loss or damage by testing your procedures. Make sure you have a backup copy of the data before trying a migration, and if possible, pilot the process using a small subset of the data. Then you can compare the extracted data with the original to ensure the process isn’t changing the data in any way.
Second, ensure the data is migrated into a database format that is accessible by the widest range of applications, and that can run on low-cost, flexible computing platforms – ideally a structured, relational SQL-compliant database. This helps to ensure flexible, open access to the data for a range of different user types. Data structuring tools are available to simplify migrations to relational databases.
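Once the pilot subset has been loaded side by side with the original extract, the comparison in the first point can be done with plain SQL. This is a minimal sketch assuming both copies are reachable from one SQL-compliant database and share identical column layouts; the table names are invented for illustration:

```sql
-- Rows in the source extract that are missing or altered in the target:
SELECT * FROM SOURCE_EXTRACT
EXCEPT
SELECT * FROM TARGET_COPY;

-- And the reverse direction, to catch spurious rows added by the load:
SELECT * FROM TARGET_COPY
EXCEPT
SELECT * FROM SOURCE_EXTRACT;

-- Both queries returning zero rows indicates the pilot subset survived
-- migration unchanged. A quick row-count sanity check as well:
SELECT (SELECT COUNT(*) FROM SOURCE_EXTRACT) AS source_rows,
       (SELECT COUNT(*) FROM TARGET_COPY)    AS target_rows;
```

Note that `EXCEPT` is the standard SQL set operator (Oracle uses `MINUS`), and that this simple check treats rows as whole values; column-by-column reconciliation or checksumming may be warranted for very large tables.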
Access all areas
Once the migration is complete, the focus should be on how users will access the data: on building the applications that will support easy, flexible but robust data access. The key to this is to use a tool that takes advantage of the open, Web model to run on any hardware and operating system at both the server and client side, delivering customizable data views and queries within a browser-based interface.
This gives even non-technical users uniform access to data migrated from legacy systems from a familiar point-and-click interface – minimizing the need for user training. It also helps organizations avoid ongoing licensing, maintenance and hardware costs for access to legacy data, and can give access to data over the Web from any location.
This makes application retirement a more efficient and cost-effective way of dealing with legacy systems, compared to other alternatives such as modernizing the application – which may mean expensive hardware updates, terminal emulation and so on – or transferring to a virtualized environment, which can carry a significant penalty in migration costs and ongoing management.
Considering the benefits
By retiring legacy systems, considerable benefits can be realized. Firstly, analysts estimate that payback on the initial outlay often comes in under 12 months, and that the total ROI over three years usually exceeds 150%.
What’s more, by decommissioning your legacy systems, your IT team can focus on more strategic tasks than maintenance and support for old platforms. There are also benefits such as reduced risk of outages, acceleration of new product initiatives thanks to fewer integration or support issues, and a more streamlined disaster recovery plan.
So with the benefits that application retirement can offer, letting go of your IT department’s past while preserving the business information could make a key impact on your operations. Applications and platforms will come and go, but data is forever.