
Informatica Mappings

This section describes new mapping features in version 10.0.

Dynamic Mappings

Effective in version 10.0, you can configure dynamic mappings to change sources, targets, and transformation logic at run time based on parameters and rules that you define. You can determine which ports a transformation receives, which ports to use in the transformation logic, and which links to establish between transformation groups. Dynamic mappings enable you to manage frequent metadata changes to the data sources or to reuse the mapping logic for different data sources with different schemas.
You can configure these behaviors individually for each dynamic mapping.
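The run-time port resolution described above can be sketched conceptually. The following is plain Python, not Informatica syntax: a name-pattern rule stands in for the rules you define, and the same rule resolves ports against two sources with different schemas.

```python
# Conceptual sketch (not Informatica syntax): resolve a transformation's
# input ports at run time from whatever schema the source presents, using
# a name-pattern rule, so the mapping logic survives schema changes.
import fnmatch

def resolve_ports(source_columns, include_pattern):
    """Return the columns that match the rule, in source order."""
    return [c for c in source_columns if fnmatch.fnmatch(c, include_pattern)]

# The same rule works against two sources with different schemas.
schema_v1 = ["order_id", "amt_total", "amt_tax", "region"]
schema_v2 = ["order_id", "amt_total", "amt_tax", "amt_shipping", "region"]

print(resolve_ports(schema_v1, "amt_*"))  # ['amt_total', 'amt_tax']
print(resolve_ports(schema_v2, "amt_*"))  # ['amt_total', 'amt_tax', 'amt_shipping']
```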
For more information about dynamic mappings, see the "Dynamic Mappings" chapter in the Informatica 10.0 Developer Mapping Guide.

Mapping Outputs

Effective in version 10.0, you can create mapping outputs that return aggregated values from the mapping run. Mapping outputs are the result of aggregating a field value or an expression from each row that a mapping processes.
For example, you can configure a mapping output that totals an order amount field across the source rows that a transformation receives. You can persist a mapping output value in the repository. You can assign a persisted mapping output value to a Mapping task input parameter. You can also assign mapping outputs to workflow variables.
Create a mapping output in the mapping Outputs view. Define the expression to aggregate in an Expression transformation in the mapping.
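The idea of a mapping output, reduced to a sketch in plain Python rather than Informatica syntax: an aggregate expression is evaluated against every row the mapping processes and yields a single value at the end of the run.

```python
# Conceptual sketch (not Informatica syntax): a mapping output aggregates a
# field or expression over every row the mapping processes and returns one
# value for the run.
rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 75.5},
    {"order_id": 3, "amount": 310.0},
]

def total_order_amount(rows):
    """Aggregate expression evaluated once per row, summed over the run."""
    return sum(r["amount"] for r in rows)

print(total_order_amount(rows))  # 505.5
```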
For more information, see the Informatica 10.0 Developer Mapping Guide.

Mapping Task Input

Effective in version 10.0, you can assign persisted mapping outputs to input parameters of the same Mapping task. Persisted mapping outputs are mapping outputs that the Data Integration Service saved in the repository from a previous workflow run. For example, you might choose to persist the latest order date from a previous workflow run. In the Mapping task Input view, you can assign the persisted value to an input parameter. You might include the input parameter in a filter expression to skip rows with order dates that are less than the last date.
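The incremental-processing pattern in the example above can be sketched in plain Python (not Informatica syntax): the persisted output from the previous run feeds a filter that skips rows already processed.

```python
# Conceptual sketch (not Informatica syntax): a persisted mapping output
# from the previous run (here, the latest order date) feeds an input
# parameter that a filter uses to skip already-processed rows.
from datetime import date

persisted_last_order_date = date(2015, 6, 30)  # saved by the previous run

rows = [
    {"order_id": 10, "order_date": date(2015, 6, 29)},
    {"order_id": 11, "order_date": date(2015, 7, 1)},
    {"order_id": 12, "order_date": date(2015, 7, 2)},
]

# Keep only rows newer than the persisted value.
new_rows = [r for r in rows if r["order_date"] > persisted_last_order_date]
print([r["order_id"] for r in new_rows])  # [11, 12]
```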
For more information, see the Mapping Tasks chapter in the Informatica 10.0 Developer Workflow Guide.

Mapping Task Output

Effective in version 10.0, you can assign mapping outputs to workflow variables. You can assign current user-defined mapping outputs and persisted user-defined mapping outputs to workflow variables. The current value is a value that the Mapping task generated in the workflow that is running. The persisted mapping output is a value that is in the repository from a previous run. You can also assign system-defined mapping outputs to workflow variables. Assign mapping outputs to workflow variables in the Mapping task Output view.
For more information, see the Mapping Tasks chapter in the Informatica 10.0 Developer Workflow Guide.

Optimization Methods

Effective in version 10.0, Informatica has the following new features for optimization methods:
Global predicate optimization method
The Data Integration Service can apply the global predicate optimization method. When it applies this method, the Data Integration Service splits, moves, removes, or simplifies the filters in a mapping so that data is filtered as close to the source as possible in the pipeline. It can also infer new predicate expressions from the predicates that the mapping already contains.
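One way to picture predicate inference, as a sketch in plain Python rather than anything the Data Integration Service literally executes: a filter on one side of a join, combined with the join condition, implies an equivalent filter on the other side, so both filters can run before the join.

```python
# Conceptual sketch of global predicate optimization: a filter on one side
# of a join, combined with the join condition, lets the engine infer an
# equivalent filter on the other side and apply both before the join.
orders = [(1, "east"), (2, "west"), (3, "east")]      # (order_id, region)
items = [(1, "widget"), (2, "gadget"), (3, "gizmo")]  # (order_id, sku)

# Mapping filter: region == "east". Join condition: equal order_id values.
east_orders = [o for o in orders if o[1] == "east"]   # filter moved to source 1
east_ids = {o[0] for o in east_orders}
east_items = [i for i in items if i[0] in east_ids]   # inferred filter on source 2
joined = [(o, i) for o in east_orders for i in east_items if o[0] == i[0]]
print(joined)
```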
For more information, see the "Mapping Optimization" chapter in the Informatica 10.0 Performance Tuning Guide.
Pushdown optimization method
You must select a pushdown type to push transformation logic to the source database: none, partial, or full. You can also view the mapping optimization plan for the pushdown type.
If the mapping has an Update Strategy transformation, you must determine pushdown compatibility for the mapping before you configure pushdown optimization.
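What full pushdown means in practice can be sketched as follows. This is an illustration in Python with sqlite3 standing in for the source database, not the SQL the Data Integration Service actually generates: the filter and aggregation run inside the database as one statement instead of row by row in the engine.

```python
# Conceptual sketch of full pushdown: instead of reading every row and
# transforming it in the engine, the transformation logic is translated
# into a single SQL statement that the source database executes. sqlite3
# stands in for the source database here.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "east", 120.0), (2, "west", 75.5), (3, "east", 310.0)])

# Full pushdown: filter + aggregation run inside the database.
pushed_down_sql = "SELECT SUM(amount) FROM orders WHERE region = 'east'"
(total,) = con.execute(pushed_down_sql).fetchone()
print(total)  # 430.0
```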
For more information, see the "Pushdown Optimization" chapter in the Informatica 10.0 Developer Mapping Guide.
Dataship-join optimization method
If a mapping joins two tables of different sizes that reside in different databases, the Data Integration Service can apply the dataship-join optimization method. The Data Integration Service moves data from the smaller table to the database that contains the larger table so that the join runs in that database.
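The cost intuition behind a dataship-join can be sketched in plain Python (not Informatica syntax): shipping the smaller table's few rows to where the larger table lives is cheaper than streaming the large table across databases.

```python
# Conceptual sketch of a dataship-join: when the tables sit in different
# databases, the engine ships the smaller table's rows to the database
# that holds the larger table and joins there, instead of streaming the
# large table across.
big_table = [(i, f"sku-{i % 5}") for i in range(1000)]    # lives in database A
small_table = [(f"sku-{k}", k * 10.0) for k in range(5)]  # lives in database B

# Ship the smaller data set (5 rows, not 1000) and join at database A.
shipped = dict(small_table)
joined = [(oid, sku, shipped[sku]) for oid, sku in big_table if sku in shipped]
print(len(joined))  # 1000
```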
For more information, see the "Mapping Optimization" chapter in the Informatica 10.0 Performance Tuning Guide.
Mapping Optimization Plan
You can view how optimization methods affect mapping performance in a mapping optimization plan.
For more information, see the "Mapping Optimization" chapter in the Informatica 10.0 Performance Tuning Guide.

Parameters

Effective in version 10.0, Informatica has the following new features for parameters:
Parameter usage
You can use parameters to represent additional properties such as connections, SQL statements, sort and group-by port lists, expression variables, and the run-time environment.
Parameter types
You can use the following parameter types for dynamic mappings: expression, input link set, port, port list, resource, and sort list.
Binding parameters between mappings, mapplets, and transformations
You can bind mapping parameters to mapplet parameters or to transformation parameters in the Instance Value column of a Parameters tab. You can also bind mapplet parameters to transformation parameters.
When you bind a parameter to another parameter, the parameter overrides the other parameter at run time. You can create a mapping or a mapplet parameter from an existing parameter and bind the parameters in one step. Click the Expose as Mapping Parameter option or the Expose as Mapplet Parameter option for the parameter you want to override.
You can bind parameters from a mapping to parameters in a Read or Write logical data object mapping.
Parameter sets
You can define a parameter set for a workflow or mapping. A parameter set is an object in the Model repository that contains a set of parameters and parameter values to use at run time. You use a parameter set with a mapping, Mapping task, or workflow. You can add one or more parameter sets to an application when you deploy the application. You can add a parameter set to multiple applications and deploy them.
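A parameter set behaves like a named bundle of values that wins over the mapping's defaults at run time. The following is a conceptual sketch in plain Python, not Informatica syntax, and the parameter names are invented for illustration:

```python
# Conceptual sketch (not Informatica syntax): a parameter set is a named
# bundle of parameter values that overrides the mapping's defaults at run
# time, so one deployed mapping can run against different environments.
# Parameter names here are hypothetical.
mapping_defaults = {"Source_Connection": "DEV_DB", "Batch_Size": "1000"}
prod_parameter_set = {"Source_Connection": "PROD_DB"}

def resolve(defaults, parameter_set):
    """Run-time values: parameter set entries win over mapping defaults."""
    return {**defaults, **parameter_set}

print(resolve(mapping_defaults, prod_parameter_set))
# {'Source_Connection': 'PROD_DB', 'Batch_Size': '1000'}
```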
Run-time environment parameter
You can set the run-time environment with a parameter. Configure a string parameter at the mapping level. Set the default value to Native or Hadoop. When you select the run-time environment for the mapping, click Assign Parameter and select the parameter that you configured.
For more information about parameters, see the Mapping Parameters chapter in the Informatica 10.0 Developer Mapping Guide.

Partitioned Mappings

Effective in version 10.0, Informatica has the following new features for partitioned mappings:
Partitioned transformations
Additional transformations now support partitioning. When a mapping that is enabled for partitioning contains a transformation that supports partitioning, the Data Integration Service can use multiple threads to transform the data.
Cache partitioning
For an Aggregator, Joiner, or Rank transformation, you can configure multiple cache directories to optimize performance during cache partitioning for the transformation. You can use the default CacheDir system parameter value if an administrator configured multiple cache directories for the Data Integration Service. Or, you can override the default CacheDir system parameter value to configure multiple cache directories specific to the transformation.
For a Sorter transformation, you can configure multiple work directories to optimize performance during cache partitioning for the transformation. You can use the default TempDir system parameter value if an administrator configured multiple temporary directories for the Data Integration Service. Or, you can override the default TempDir system parameter value to configure multiple directories specific to the transformation.
Mappings that order data
The Data Integration Service can create partitions for a mapping that establishes a sort order. You can establish sort order in a mapping with a sorted flat file source, a sorted relational source, or a Sorter transformation. When the Data Integration Service adds a partition point to a mapping, it might redistribute data and lose the order established earlier in the mapping. To maintain order in a partitioned mapping, you must specify that Expression, Java, Sequence Generator, SQL, and Write transformations maintain the row order in the transformation advanced properties.
Partitioned flat file targets
To optimize performance when multiple threads write to a flat file target, you can configure multiple output file directories for a flat file data object. You can use the default TargetDir system parameter value if an administrator has configured multiple target directories for the Data Integration Service. Or, you can override the default TargetDir system parameter value to configure multiple output file directories specific to the flat file data object.
Suggested parallelism value for transformations
If you override the maximum parallelism for a mapping, you can define a suggested parallelism value for a specific transformation. The Data Integration Service uses the suggested parallelism value for the number of threads for that transformation pipeline stage as long as the transformation can be partitioned. You can define a suggested parallelism value that is less than the maximum parallelism value defined for the mapping or the Data Integration Service. You might want to define a suggested parallelism value to optimize performance for a transformation that contains many ports or performs complicated calculations.
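How a suggested parallelism value interacts with the mapping's maximum parallelism can be sketched as a simple rule. This is a conceptual illustration in Python, not the Data Integration Service's actual implementation:

```python
# Conceptual sketch of how a suggested parallelism value sets the thread
# count for one pipeline stage: the stage uses the suggested value when
# the transformation can be partitioned, and never exceeds the mapping's
# maximum parallelism.
def stage_threads(max_parallelism, suggested=None, partitionable=True):
    if not partitionable:
        return 1
    if suggested is None:
        return max_parallelism
    return min(suggested, max_parallelism)

print(stage_threads(8))                       # 8
print(stage_threads(8, suggested=4))          # 4
print(stage_threads(8, partitionable=False))  # 1
```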
For more information about partitioned mappings, see the "Partitioned Mappings" chapter in the Informatica 10.0 Developer Mapping Guide.

Run-time Properties

Effective in version 10.0, you can configure the following run-time properties for a mapping:
Stop on Errors
Stops the mapping if a nonfatal error occurs in the reader, writer, or transformation threads. Default is disabled.
Target Commit Interval
The number of rows to use as a basis for a commit. The Data Integration Service commits data based on the number of target rows that it processes and the constraints on the target table.
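The commit-interval behavior can be sketched as follows, with sqlite3 standing in for the target database. This is an illustration of the batching pattern, not the Data Integration Service writer itself:

```python
# Conceptual sketch of a target commit interval: the writer counts target
# rows and issues a commit each time the interval is reached, plus a final
# commit for the remainder. sqlite3 stands in for the target database.
import sqlite3

def load(rows, commit_interval):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (v INTEGER)")
    commits = 0
    for n, row in enumerate(rows, start=1):
        con.execute("INSERT INTO t VALUES (?)", (row,))
        if n % commit_interval == 0:
            con.commit()
            commits += 1
    con.commit()  # final commit for any remaining rows
    return commits + 1

print(load(range(25), commit_interval=10))  # 3 commits: after rows 10, 20, 25
```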
For more information, see the Informatica 10.0 Developer Mapping Guide.

Target Load Order Constraints

Effective in version 10.0, you can configure constraints to control the order in which rows are loaded and committed across target instances in a mapping. Define constraints on the Load Order tab of the mapping Properties view. Each constraint consists of a primary target name and a secondary target name to restrict the load order.
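A set of primary/secondary constraints defines a partial order over the targets, and any order that respects it is valid. A sketch of that idea in plain Python (not how the Data Integration Service schedules loads, and the target names are invented for illustration):

```python
# Conceptual sketch of target load order constraints: each constraint says
# a primary target must finish loading before a secondary target starts.
# A topological sort over the constraints yields a valid load order.
from graphlib import TopologicalSorter  # Python 3.9+

# (primary, secondary): load primary before secondary. Names are hypothetical.
constraints = [("Customers", "Orders"), ("Orders", "OrderItems")]

ts = TopologicalSorter()
for primary, secondary in constraints:
    ts.add(secondary, primary)  # secondary depends on primary
order = list(ts.static_order())
print(order)  # ['Customers', 'Orders', 'OrderItems']
```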
For more information, see the Informatica 10.0 Developer Mapping Guide.