Mapping Validation
When you develop a streaming mapping, you must configure it so that the Data Integration Service can read and process the entire mapping. The Developer tool marks the mapping as invalid when it detects errors that prevent the Data Integration Service from running the mapping.
The Developer tool performs the following types of validation:
- Environment
- Data object
- Transformation
- Run-time
Environment Validation
The Developer tool performs environment validation each time you validate a streaming mapping.
The Developer tool generates an error in the following scenarios:
- The mapping has a Native environment.
- The mapping does not have the Spark validation environment and the Hadoop execution environment.
- The mapping has a Hadoop execution environment and also has the Native, Blaze, Hive on MapReduce, and Spark validation environments.
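The environment rules above amount to a few simple checks on the execution and validation environments. The following Python sketch is illustrative only; the function and argument names are hypothetical and do not reflect the Developer tool's internal implementation.

```python
# Illustrative sketch of the environment validation rules above.
# All names are hypothetical; the Developer tool's checks are not public API.
def validate_environment(execution_env, validation_envs):
    """Return a list of error messages for a streaming mapping's environments."""
    errors = []
    if execution_env == "Native":
        errors.append("A streaming mapping cannot use the Native environment.")
    if execution_env == "Hadoop" and "Spark" not in validation_envs:
        errors.append("A streaming mapping requires the Spark validation environment.")
    # Extra engines selected alongside Spark also raise an error.
    extra = set(validation_envs) & {"Native", "Blaze", "Hive on MapReduce"}
    if execution_env == "Hadoop" and extra:
        errors.append(f"Remove unsupported validation environments: {sorted(extra)}")
    return errors
```

A valid streaming mapping would pass `validate_environment("Hadoop", ["Spark"])` with no errors.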
Data Object Validation
When you validate a mapping, the Developer tool verifies the source and target data objects that are part of the streaming mapping.
The Developer tool generates an error in the following scenarios:
- The mapping contains a source data object other than a Kafka or JMS data object, or a target data object other than a Kafka, complex file, or relational data object.
- The read and write properties of the Kafka, JMS, and complex file data objects are not specified correctly.
- You add a Kafka or a JMS data object to a Logical Data Object (LDO) mapping, a REST mapping, or a mapplet.
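The data object rules above can be sketched as a membership check on the source and target types. This is an illustrative Python sketch under assumed names; the type strings and the `mapping_kind` parameter are hypothetical, not Developer tool identifiers.

```python
# Illustrative sketch of the data object validation rules above (hypothetical names).
STREAMING_SOURCES = {"Kafka", "JMS"}
STREAMING_TARGETS = {"Kafka", "ComplexFile", "Relational"}
RESTRICTED_MAPPINGS = {"LDO mapping", "REST mapping", "mapplet"}

def validate_data_objects(source_type, target_type, mapping_kind="mapping"):
    """Return error messages for the source and target data objects."""
    errors = []
    if source_type not in STREAMING_SOURCES:
        errors.append(f"Unsupported streaming source: {source_type}")
    if target_type not in STREAMING_TARGETS:
        errors.append(f"Unsupported streaming target: {target_type}")
    # Kafka and JMS data objects are not allowed in LDO mappings, REST
    # mappings, or mapplets.
    if source_type in STREAMING_SOURCES and mapping_kind in RESTRICTED_MAPPINGS:
        errors.append(f"Kafka and JMS data objects are not allowed in a {mapping_kind}.")
    return errors
```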
Transformation Validation
When you validate a mapping, the Developer tool performs validation on the transformations that are part of the streaming mapping.
The Developer tool performs the following validations:
- A mapping cannot contain a transformation other than the Aggregator, Expression, Filter, Joiner, Lookup, Router, Union, and Window transformations.
- A Window transformation must be added between a streaming source and a Sorter, Aggregator, or Joiner transformation.
- A Window transformation must have at least one upstream streaming source.
- All Window transformations must have a slide interval that is a multiple of the mapping batch interval.
- A Window transformation that is downstream from another Window transformation must have a slide interval that is a multiple of the slide interval of the upstream Window transformation.
- The slide interval of a sliding Window transformation must be less than the window size.
- A parameter that specifies the window size must be of the TimeDuration parameter type.
- The window size and slide interval of a Window transformation must be greater than 0.
- The downstream Window transformations in the pipelines leading to a Joiner transformation must have the same slide interval.
- A Window transformation cannot be added to a Logical Data Object mapping, a REST mapping, or a mapplet.
- If one pipeline leading to a Union transformation has a Window transformation, all streaming pipelines must have a Window transformation. All downstream Window transformations in the pipelines leading to the Union transformation must have the same slide interval.
- A Union transformation cannot be used to merge data from streaming and non-streaming pipelines.
- A Union transformation does not require a Window transformation between a streaming source and itself.
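Several of the Window transformation rules above are simple arithmetic checks on the window size, slide interval, and batch interval. The following Python sketch illustrates them under assumed names; it is not the Developer tool's implementation, and intervals are treated as plain positive numbers in a common time unit.

```python
# Illustrative sketch of the Window transformation interval rules above.
# All names are hypothetical; intervals are numbers in a shared time unit.
def validate_window(window_size, slide_interval, batch_interval,
                    upstream_slide=None, sliding=True):
    """Return error messages for one Window transformation's intervals."""
    errors = []
    if window_size <= 0 or slide_interval <= 0:
        errors.append("Window size and slide interval must be greater than 0.")
    if slide_interval % batch_interval != 0:
        errors.append("Slide interval must be a multiple of the batch interval.")
    if sliding and slide_interval >= window_size:
        errors.append("A sliding window's slide interval must be less than "
                      "its window size.")
    # Downstream windows must align with the upstream window's slide interval.
    if upstream_slide is not None and slide_interval % upstream_slide != 0:
        errors.append("Slide interval must be a multiple of the upstream "
                      "Window transformation's slide interval.")
    return errors
```

For example, a 10-second sliding window with a 5-second slide over a 5-second batch interval satisfies all of the checks, while a 5-second slide over a 3-second batch interval does not.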
Run-time Validation
The Developer tool performs run-time validation each time you run a streaming mapping.
The Developer tool generates an error in the following scenarios:
- The state store is not configured when you specify the source configuration properties for the mapping.
- Neither the Maximum Rows Read property nor the Maximum Runtime Interval property is configured when you specify the source configuration properties for the mapping.
- The Maximum Runtime Interval property does not have the correct time format.
- The Batch Interval property does not have the correct time format.
- The default values of the Batch Interval and Maximum Runtime Interval properties are not specified.
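The run-time checks above can be sketched as a single configuration validator. This Python sketch is illustrative only: the dictionary keys are hypothetical, and the time format shown (a number followed by a unit, such as "10 seconds") is an assumption, since the section does not define the exact format the Developer tool accepts.

```python
import re

# Illustrative run-time configuration check (hypothetical keys).
# The accepted time format is an assumption, not the documented format.
TIME_PATTERN = re.compile(r"^\d+\s+(milliseconds|seconds|minutes|hours)$")

def validate_runtime_config(config):
    """Return error messages for a streaming mapping's run-time properties."""
    errors = []
    if not config.get("state_store"):
        errors.append("The state store is not configured.")
    # At least one termination condition must be set.
    if not (config.get("maximum_rows_read") or config.get("maximum_runtime_interval")):
        errors.append("Configure Maximum Rows Read or Maximum Runtime Interval.")
    for key in ("maximum_runtime_interval", "batch_interval"):
        value = config.get(key)
        if value is not None and not TIME_PATTERN.match(value):
            errors.append(f"{key} does not have the correct time format.")
    return errors
```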