Issue 2, 2016
By Sury Nagarajan, Senior Product Manager, Software AG
and Navdeep Sidhu, Senior Director, Product Marketing, Software AG
Learn how the webMethods adapter for JDBC® solves the problem of moving data from a transactional system to Hadoop® for analytics.
Digital transformation requires understanding customers' behavior in order to provide them with an unmatched experience. To understand customers, companies are increasingly using Hadoop to analyze large volumes of data about the products they own, sales history, support tickets, and other information available from the enterprise's existing applications.
There are several reasons behind the increasing adoption of Hadoop-based platforms. Hadoop runs on commodity hardware and is available both as open source and under commercial licenses, to suit any budget. Given this low cost of entry, many companies start Hadoop initiatives only to realize later how challenging it is to get accurate, current data into the Hadoop environment.
To conduct analytics, transactional data held in ERP and other legacy applications must be moved to Hadoop. This is where companies face multiple challenges. Numerous analyst studies published over the past five years confirm that one of the top obstacles to realizing the value of Big Data is the difficulty of integrating data into Big Data stores.
webMethods integrates big data
The most widely used webMethods product for connecting to databases is the webMethods adapter for JDBC, and most webMethods customers are already familiar with how easy the adapter is to use.
Fortunately, this same adapter is now available for connecting to Hadoop over Apache® Hive™. The webMethods adapter for JDBC, in combination with other adapters for packaged applications such as SAP®, Oracle® and PeopleSoft®, solves the problem of moving data from a transactional system to Hadoop for analysis.
The three products currently available are:
- webMethods BigData driver for Apache Hive
- webMethods BigData driver for MongoDB®1
- webMethods BigData driver for Cassandra™
Capabilities include read as well as write operations:
- Delete (except Hive)
- Batch Insert
These drivers are for use with the webMethods adapter for JDBC to connect to Hive, MongoDB and Cassandra, respectively.
While Hive is a layered product on top of Hadoop that provides an SQL interface, MongoDB and Cassandra are NoSQL databases.
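The batch-insert capability listed above corresponds to the standard JDBC batching pattern: many rows are bound to one prepared statement and sent to the server in a single round trip. The sketch below is a generic illustration of that pattern, not the adapter's internal implementation; the table name, columns, and Hive connection URL are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class BatchInsertSketch {

    // Build a parameterized INSERT for the given table and columns.
    static String buildInsertSql(String table, String[] columns) {
        String cols = String.join(", ", columns);
        String params = String.join(", ",
                Collections.nCopies(columns.length, "?"));
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + params + ")";
    }

    // Insert many rows in one round trip via JDBC batching -- the same
    // pattern a driver's batch-insert capability exposes to clients.
    static void batchInsert(Connection conn, String table, String[] columns,
                            List<Object[]> rows) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(buildInsertSql(table, columns))) {
            for (Object[] row : rows) {
                for (int i = 0; i < row.length; i++) {
                    ps.setObject(i + 1, row[i]);
                }
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }

    public static void main(String[] args) throws Exception {
        String[] columns = {"ticket_id", "customer_id", "status"};
        // Print the generated SQL; a real run needs a JDBC URL passed as
        // the first argument, e.g. jdbc:hive2://host:10000/default
        // (hypothetical host), with the driver on the classpath.
        System.out.println(buildInsertSql("support_tickets", columns));
        if (args.length > 0) {
            List<Object[]> rows = new ArrayList<>();
            rows.add(new Object[]{1, 42, "open"});
            try (Connection conn = DriverManager.getConnection(args[0])) {
                batchInsert(conn, "support_tickets", columns, rows);
            }
        }
    }
}
```

Batching matters particularly for Hive, where each statement carries noticeable per-request overhead; grouping rows keeps load jobs from degenerating into thousands of tiny round trips.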
What’s coming later?
You can look forward to the following innovations to support big data analysis in the coming months:
- Connector for HDFS: A native connector for Hadoop with read/write capabilities and a variety of file formats (e.g., XML, CSV, JSON, text, Apache Avro™, ORC) will be available in October 2016.
- Adapter for HBase®: A connector to HBase NoSQL store, based on our adapter technologies.
- Adapter for Kafka™: An adapter to publish messages to and consume messages from Apache Kafka.
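Of the file formats the HDFS connector will support, newline-delimited JSON is among the simplest to produce: one record per line. The sketch below shows what emitting such records might look like, using only the Java standard library; the record fields are hypothetical, and a local file stands in for an HDFS path.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class NdjsonSketch {

    // Minimal JSON string escaping for the fields we emit.
    static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    // Render one flat record as a single JSON line, a format
    // commonly landed in HDFS for downstream analytics.
    static String toJsonLine(Map<String, Object> record) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, Object> e : record.entrySet()) {
            if (!first) sb.append(",");
            first = false;
            sb.append("\"").append(escape(e.getKey())).append("\":");
            Object v = e.getValue();
            if (v instanceof Number) {
                sb.append(v);
            } else {
                sb.append("\"").append(escape(String.valueOf(v))).append("\"");
            }
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) throws IOException {
        Map<String, Object> ticket = new LinkedHashMap<>();
        ticket.put("ticket_id", 1);
        ticket.put("status", "open");
        // Write one record per line; an HDFS client or connector
        // would move a file like this into the cluster.
        Files.write(Path.of("tickets.ndjson"), List.of(toJsonLine(ticket)));
        System.out.println(toJsonLine(ticket));
    }
}
```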
Interested in using webMethods for big data integration?
Please reach out to your account executive and ask how you can add the new big data adapters to your existing webMethods landscape.
1 MongoDB is a registered trademark of MongoDB and the product name is under review by the MongoDB trademark team.