Data Quality Mappings and Workflows Overview
Data Quality is a processing engine that Data Integration Hub uses to run custom publications and subscriptions for on-premises applications. The Data Integration Service runs the Data Quality mappings and workflows in the native environment.
You use the Developer tool to develop the Data Quality mappings that process the publications and subscriptions, and you can combine multiple Data Quality mappings into a Data Quality workflow. You then use the Data Integration Hub Operation Console to import the Data Quality mapping or workflow into a Data Integration Hub workflow. For details, see Creating a Data Integration Hub Workflow.
The Data Integration Hub operator creates a publication or subscription in the Data Integration Hub Operation Console and selects the Data Integration Hub workflow that is based on the Data Quality mapping or Data Quality workflow. For more information, see the Data Integration Hub Operator Guide.
You can find sample mappings in the following directory: <DIHInstallationDir>/samples/idq_mappings. Each sample mapping has an associated readme file that describes the mapping and contains usage instructions.