Migrate Legacy Data to Hadoop with a Solution from Qlik and Attunity

Technology partners Qlik and Attunity have developed a solution that moves large-scale ERP data to the Hortonworks Data Platform, where you can perform analytics on it.

ERP data can be hard to interact with at the database level. Legacy ERP architectures are well suited to running ERP workloads, but they make it a real challenge to extract business insights.

Experts from Qlik and Attunity have developed a technology solution that brings large-scale ERP data into Hadoop for analytics. This joint solution, called "SMASH", makes ERP data available to business users who want to extract value from it. The name is an acronym drawn from the products that make up the solution, listed below.

  • Qlik Sense: a highly flexible and scalable analytics platform for BI
  • Microsoft Azure: a set of cloud services that helps organizations meet their business challenges
  • Attunity Replicate: software that accelerates data replication, ingest, and streaming across a wide range of heterogeneous databases, data warehouses, and data platforms
  • SAP: data management platforms that handle both transactions and analytics in memory on a single data copy
  • Hortonworks DataFlow (HDF): an end-to-end platform that collects, curates, analyzes, and acts on data in real time with a drag-and-drop visual interface

The illustration below shows the architecture that makes up the SMASH solution. This configuration is only one of several possible scenarios: Attunity supports all flavors of SAP (ECC, BW, HANA), and the solution can be deployed in any cloud, on-premises, or hybrid environment, with Hortonworks as the data target and Qlik as the analytics layer.
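To make the data path concrete, here is a minimal, purely illustrative sketch in Python. The function names are stand-ins invented for this example, not real Attunity, HDF, or Qlik APIs; it only models the flow described above: changes captured from the SAP source are replicated, landed in the Hortonworks data target, and then aggregated at the analytics layer.

```python
# Illustrative stand-ins only -- none of these functions are real product APIs.
# Data path: SAP source -> replication (Attunity-style change capture)
#            -> landing in the data target (HDP) -> analytics (Qlik-style query).

def replicate(change_log):
    """Stand-in for change-data capture: forward only committed source changes."""
    return [rec for rec in change_log if rec["committed"]]

def land_in_target(records, data_lake):
    """Stand-in for flow delivery into the Hadoop data target."""
    data_lake.extend(records)
    return data_lake

def analyze(data_lake):
    """Stand-in for an analytics-layer aggregation over the landed data."""
    return sum(rec["amount"] for rec in data_lake)

# A toy ERP change log: one record is uncommitted and is filtered out.
change_log = [
    {"id": 1, "amount": 100.0, "committed": True},
    {"id": 2, "amount": 50.0, "committed": False},
    {"id": 3, "amount": 25.0, "committed": True},
]

lake = land_in_target(replicate(change_log), [])
total = analyze(lake)  # 125.0
```

The point of the sketch is the separation of stages: replication, landing, and analytics are independent components, which is what lets the real solution swap deployment targets (cloud, on-premises, or hybrid) without changing the overall flow.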

Figure: Attunity Replicate for SAP ERP data (architecture diagram)
Combining multiple technologies can be a challenge, especially with five separate moving components. However, the SMASH integration went smoothly and, apart from a few configuration "learning" moments, was relatively painless.

To learn more:
