Data Engineering Integration and Streaming Mapping and Workflows Overview

Data Integration Hub uses Data Engineering Integration and Data Engineering Streaming to run Data Integration Hub big data publications and subscriptions.
You use Data Engineering Integration mappings to run custom batch publications and subscriptions that publish and consume large, diverse, and fast-changing data sets. To run multiple custom batch publications and subscriptions, you create a Data Engineering Integration workflow that contains multiple Data Engineering Integration mappings. You use Data Engineering Streaming mappings to run custom multi-latency publications that publish streams of data in real time.
The Data Integration Service runs Data Engineering Integration mappings, Data Engineering Streaming mappings, and Data Engineering Integration workflows in the Hadoop environment.
You use the Developer tool to develop the Data Engineering Integration mappings, Data Engineering Streaming mappings, and Data Engineering Integration workflows that process the publications and subscriptions. You then use the Data Integration Hub Operation Console to import the mappings into a Data Integration Hub workflow. For details, see Creating a Data Integration Hub Workflow.
The Data Integration Hub operator creates a publication or a subscription in the Data Integration Hub Operation Console and selects the Data Integration Hub workflow that is based on the Data Engineering Integration mapping, Data Engineering Streaming mapping, or Data Engineering Integration workflow. For more information, see the Data Integration Hub Operator Guide.
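
Outside of Data Integration Hub, you can also start mappings and workflows that are deployed to a Data Integration Service from the command line with the infacmd utility. The following is a minimal sketch only: the domain, service, application, mapping, and workflow names are placeholders, and the available options can vary by Informatica version, so verify them against the Command Reference for your release.

    # All names below (MyDomain, MyDIS, MyApp, MyMapping, MyWorkflow) are
    # hypothetical placeholders for objects in your own domain.

    # Run a single mapping deployed in application MyApp.
    infacmd ms runMapping -dn MyDomain -sn MyDIS -un Administrator -pd MyPassword -a MyApp -m MyMapping

    # Start a workflow deployed in application MyApp.
    infacmd wfs startWorkflow -dn MyDomain -sn MyDIS -un Administrator -pd MyPassword -a MyApp -wf MyWorkflow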

Sample Mappings

You can find sample mappings in the following locations: