Data Engineering Integration and Streaming Mappings and Workflows Overview
Data Integration Hub uses Data Engineering Integration and Data Engineering Streaming to run Data Integration Hub big data publications and subscriptions.
You use Data Engineering Integration mappings to run custom batch publications and subscriptions that publish and consume large, diverse, and fast-changing data sets. To run multiple custom batch publications and subscriptions, you use Data Engineering Integration workflows, which you create from multiple Data Engineering Integration mappings. You use Data Engineering Streaming mappings to run custom multi-latency publications that publish streams of data in real time.
The Data Integration Service runs the Data Engineering Integration mappings, Data Engineering Streaming mappings, and Data Engineering Integration workflows in the Hadoop environment.
You use the Developer tool to develop the Data Engineering Integration mappings, Data Engineering Streaming mappings, and Data Engineering Integration workflows that process the publications and subscriptions. You then use the Data Integration Hub Operation Console to import the mappings into a Data Integration Hub workflow. For details, see Creating a Data Integration Hub Workflow.

The Data Integration Hub operator creates a publication or a subscription in the Data Integration Hub Operation Console and selects the Data Integration Hub workflow that is based on the Data Engineering Integration mapping, Data Engineering Streaming mapping, or Data Engineering Integration workflow. For more information, see the Data Integration Hub Operator Guide.
Sample mappings
You can find sample mappings in the following locations:
- Data Engineering Integration mappings: <DIHInstallationDir>/samples/bdm_mappings. Each sample mapping has an associated readme file that describes the sample mapping and contains guidelines for using the mapping as a basis to create your own mappings.
- Data Engineering Streaming mappings: <DIHInstallationDir>/samples/bds_mappings. Under this folder, there are sub-folders with sample mappings for an Oracle publication repository and for a Microsoft SQL Server publication repository. The readme file that resides in this folder describes the sample mappings and contains guidelines for using the mappings as a basis to create your own mappings.
- Data Engineering Integration workflows: <DIHInstallationDir>/samples/bdm_workflows. Under this folder, there are sub-folders with sample workflows for an Oracle publication repository and for a Microsoft SQL Server publication repository. The readme file that resides in this folder describes the sample workflows and contains guidelines for using the workflows as a basis to create your own workflows.
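The following Python sketch is not part of the product. It only illustrates how you might locate the sample folders listed above and print their readme files before you copy a sample mapping or workflow. The installation path in the script is an assumption; replace it with your actual <DIHInstallationDir>.

```python
# Minimal sketch: walk the Data Integration Hub sample folders and print the
# readme files they contain.
# Assumption: DIH_INSTALL_DIR below is a placeholder for <DIHInstallationDir>.
from pathlib import Path

DIH_INSTALL_DIR = Path("/opt/Informatica/DIH")  # assumed path; replace with yours

SAMPLE_DIRS = [
    "samples/bdm_mappings",   # Data Engineering Integration mappings
    "samples/bds_mappings",   # Data Engineering Streaming mappings
    "samples/bdm_workflows",  # Data Engineering Integration workflows
]

for rel in SAMPLE_DIRS:
    root = DIH_INSTALL_DIR / rel
    if not root.is_dir():
        print(f"Missing sample folder: {root}")
        continue
    print(f"== {root} ==")
    # Readme files may sit at the top level or inside the per-repository
    # sub-folders (Oracle, Microsoft SQL Server).
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.name.lower().startswith("readme"):
            print(f"-- {path.relative_to(root)} --")
            print(path.read_text(errors="replace"))
```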