
Developing Data Engineering Streaming Mappings for Publications

To develop a Data Engineering Streaming mapping for a publication, perform the following steps in the Developer tool:
  1. Create source and target connections. The source connection is a connection to the publishing application and the target connection is a connection to the Data Integration Hub publication repository.
  2. Create source and target data objects.
  3. Create a mapping and add the source and target objects to the mapping.
  4. Add an Expression transformation to the mapping, configure ports in the transformation, and connect ports between the source and the transformation.
  5. Add a Java transformation to the mapping and map fields from the Expression transformation to the Java transformation and from the Java transformation to the target.
  6. Configure the mapping run-time environment and create an application from the mapping.

Step 1. Create Source and Target Connections

    1. Create a Streaming source connection to the publishing application.
    2. Create a target connection to the Data Integration Hub publication repository.

Step 2. Create Source and Target Data Objects

Create data objects under Physical Data Objects.
    1. Create a source data object and define the column projection to publish in the source connection.
    2. Create a target data object and select the table in the target connection to which to publish the source data. The object must be a relational data object.

Step 3. Create a Mapping with Source and Target

    1. Create and name a new mapping.
    2. Add the source physical data object to the mapping as a Reader.
    3. Add the target physical data object to the mapping as a Writer.

Step 4. Add an Expression Transformation to the Mapping

    1. Add an Expression transformation to the mapping, between the source and target objects.
    2. Link all the ports from the source object to the identical ports in the Expression transformation.
    3. Configure the following additional ports in the Expression transformation:

    Port              Description
    PublicationName   Name of the publication.
    DX_SERVER_URL     A valid Data Integration Hub RMI URL. For example: rmi://localhost:18095.
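
The DX_SERVER_URL port must hold a well-formed RMI URL in the rmi://host:port shape shown above. As a minimal sketch, not part of the product, the following standard Java code checks that shape before a value is used; the class name and validation rules are illustrative assumptions only.

```java
import java.net.URI;
import java.net.URISyntaxException;

// Illustrative check that a DX_SERVER_URL value follows the
// rmi://host:port shape. The sample URL matches the example above;
// substitute the host and port of your own Data Integration Hub server.
public class DxServerUrlCheck {

    static boolean isValidRmiUrl(String url) {
        try {
            URI uri = new URI(url);
            return "rmi".equals(uri.getScheme())
                    && uri.getHost() != null
                    && uri.getPort() > 0;
        } catch (URISyntaxException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidRmiUrl("rmi://localhost:18095")); // true
        System.out.println(isValidRmiUrl("localhost:18095"));       // false: no rmi scheme
    }
}
```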

Step 5. Add a Java Transformation to the Mapping

    1. Copy the Java transformation from the sample mapping. You can find sample mappings in the following location: <DIHInstallationDir>/samples/bds_mappings. Under this folder, there are sub-folders with sample mappings for an Oracle publication repository and for a Microsoft SQL Server publication repository.
    2. Map the following fields from the Expression transformation to the Java transformation:
    3. Map the following fields from the Java transformation to the topic table in the target object:

    Java Transformation Field    Topic Table Field
    DXPublicationInstanceID      DIH__PUBLICATION_INSTANCE_ID
    DXPublicationInstanceDate    DIH__PUBLICATION_INSTANCE_DATE
    4. In the target transformation, open the target data object and change the data type of DIH__PUBLICATION_INSTANCE_ID from decimal to bigint.
    5. Save the mapping.
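
The data-type change in step 4 reflects that the publication instance ID is a whole 64-bit number rather than a decimal. The real values are produced by the Java transformation copied from the sample mapping; the sketch below is only an illustration of the two field shapes, and the variable names and date format are assumptions.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Illustrative sketch only: shows why DIH__PUBLICATION_INSTANCE_ID needs a
// bigint column (a Java long) and that DIH__PUBLICATION_INSTANCE_DATE is a
// timestamp. The actual values come from the sample Java transformation.
public class PublicationInstanceFields {
    public static void main(String[] args) {
        long publicationInstanceId = System.currentTimeMillis(); // whole number, fits bigint
        String publicationInstanceDate =
                LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME);
        System.out.println("DIH__PUBLICATION_INSTANCE_ID=" + publicationInstanceId);
        System.out.println("DIH__PUBLICATION_INSTANCE_DATE=" + publicationInstanceDate);
    }
}
```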

Step 6. Configure the Mapping Run-time Environment and Create an Application

    1. In the Properties pane, select Run-time. Under Validation Environments, select Hadoop, and then select Spark. Verify that Native is not selected.
    2. Create an application from the mapping.
    The mapping is deployed to the Data Integration Service for Hadoop environment.