You must configure a mapping that contains a Source transformation for an Adabas source and map it to a Target transformation.
To begin creating a mapping in Data Integration, click New > Mappings. In the New Mapping dialog box, select Mapping and then click Create.
A mapping must contain at least one Source transformation mapped to one Target transformation. Alternatively, a mapping can contain multiple Adabas sources and multiple targets. For example, you can define multiple one-to-one mappings, map one source to multiple targets, or map multiple sources to one target. To map multiple sources to the same target, you can create multiple one-to-one mappings that map each source separately to the same target. In this case, each source must have a different primary key value.
Important: If you include multiple sources in a mapping, all of the Source transformations must use the same Adabas connection, which connects to the same PowerExchange Listener and source location.
When you define a Source transformation in a mapping, specify a name for the transformation, an Adabas connection that can connect to the PowerExchange Listener, the source type, and the data map for the Adabas source object. You can also set some additional source properties. The PWX Bulk Metadata Adapter retrieves metadata for the source table from the PowerExchange data map, including the field definitions.
When you define a Target transformation, specify a name for the transformation, a connection to the target, the target type, and the operation type of Data Driven. In the user's environment, the target can reside on premises or in a cloud. If you do not want to send all of the incoming fields to the target, you can define one or more exclusion field rules. As part of configuring the target, you must also map the incoming fields to the target fields. You can map fields automatically based on the field names or manually map them.
Instead of selecting specific connections and objects in the Source and Target transformations, you can define a parameter for the connection, the source or target object, or both. You then must enter a specific connection or object in the associated mapping tasks. If you want to use a parameter for the source or target connection, you must perform one of the following actions before the mapping can be successfully validated:
•Set the Source Type or Target Type field to Parameter and then enter a parameter for the source or target object in the Parameter field. On the Field Mapping tab of the Target transformation, you must set the map option to Automatic for the mapping to be validated.
•If you want to use a specific source or target object with a connection parameter, first select a specific connection, set the Source Type or Target Type field to Single Object, select a source or target object in the Object field, validate the mapping, and then enter the connection parameter.
Tip: If you want to validate the source-to-target field mappings, map the fields in the Target transformation before validating the mapping. After the mapping is confirmed as valid, enter connection parameters.
You can also completely or partially parameterize field mappings, and then define specific mappings in the mapping tasks.
Adabas source properties in mappings
When you create a mapping, you configure a Source transformation for an Adabas source and set some source properties for the Source transformation.
Click the Source transformation box on the mapping canvas. Then configure the source properties in the Source Properties panel below the mapping canvas.
If you have multiple Source transformations in a mapping, configure source properties for each source. If you use a specific connection instead of a connection parameter, enter the same connection for all sources.
Note: If a mapping includes source tables or columns that have special characters in their names, the associated mapping task fails because it cannot import the source metadata. Special characters include #, $, @, %, *, !, and ~. Therefore, ensure that the names of the source tables and mapped columns do not contain any of these special characters. Rename the tables and columns if necessary.
The Source Properties panel contains the following tabs on which you can enter information:
•General tab. Specify a user-defined name and optional description for the Adabas source.
•Source tab. Configure the source properties under Details and under Advanced. The advanced properties are available only if you enter a specific connection for the source. If you enter a parameter for the source connection, the advanced properties are not displayed. However, you can define these properties in the associated mapping task after you select a specific source connection for the task.
Note: The Filter and Sort options under Query Options are not supported for Adabas connections. If you specify a parameter for the source connection, the Sort and Filter options are displayed but are ignored if you configure them.
The following table describes the Adabas source properties under Details:
Property
Description
Connection
Required. Select a source connection of the type Adabas, or click New Connection to create one.
Alternatively, you can define a connection parameter in the mapping and enter a specific connection in each mapping task that is associated with the mapping.
If you want to use a connection parameter with a specific source object, you must first select a specific connection so that you can access the source to select the source object in the Object field. After you select the source object, define the connection parameter.
If you want to use parameters for both the connection and source object, you do not need to first select a specific connection. You can just define the parameters in either order.
To define a connection parameter, click New Parameter next to the Connection property and enter the following information:
- Enter a parameter name that begins with a letter and is up to 200 characters in length, including @, #, _, and alphanumeric characters.
- In the Type property, verify that connection is selected.
- Leave the Connection Type property blank.
After you click OK, the parameter appears in the Connection property.
Source Type
Required. Select Multi Group to select a data map for an Adabas source table.
Alternatively, you can define a parameter for the source object in the mapping and enter a specific source object in each mapping task that is associated with the mapping.
To define a parameter, select Parameter as the source type, and then specify the parameter in the Parameter property.
Parameter
Parameter to use for the source object. This field only appears when you select Parameter as the source type.
To parameterize a source object, click New Parameter. Enter the parameter name and then select the multi group data object type. This type of parameter can accept a set of groups. In the New Input Parameter dialog box, you can add or delete groups for the parameter.
When you create a multi-group data object parameter, the corresponding links are also created. You can use these links in the mapping to connect to downstream transformations.
Schema, Data Map, and Tables
On the Source tab, click the pencil icon to select a schema, a data map, or a table.
Note: The Preview Data feature is not supported.
The following table describes the optional source properties under Advanced:
Property
Description
Map Schema Override
A schema name that overrides the schema name in the PowerExchange data map.
Map Name Override
A map name that overrides the name defined in the PowerExchange data map.
Database Id Override
A database ID that overrides the ID in the PowerExchange data map.
File Id Override
A file ID that overrides the ID in the PowerExchange data map.
Filter Overrides
One or more conditions that filter the source records that the PWX Reader retrieves from the source files. Based on the filter conditions, a WHERE clause is appended to the default SQL SELECT query that the PWX Reader uses to read the records.
You must use the nonrelational SQL syntax that PowerExchange supports for comparison and LIKE operators. For more information, see the "PowerExchange Nonrelational SQL" chapter in the PowerExchange Reference Manual. If the filter includes a date or timestamp column, ensure that the range of years in the column data is within the range of years for PowerExchange data filtering, as controlled by the DATERANGE statement in the DBMOVER configuration file. The default range is 1800 to 2200.
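For example, to widen the range of years that PowerExchange accepts, the DBMOVER configuration file might contain a DATERANGE statement such as the following (the values shown are illustrative only):
DATERANGE=(1750,2200)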
For example, to select records in which the TYPE column has a value of A or D, specify the following condition:
TYPE='A' or TYPE='D'
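Similarly, to select records in which a hypothetical NAME column begins with the characters SM, you could specify a condition that uses the LIKE operator:
NAME LIKE 'SM%'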
Maximum length of the entire SELECT statement that PowerExchange builds, including the WHERE clause with the filter conditions, is 8192 bytes.
Note: If you specify the Filter Overrides attribute and also specify an SQL Query Override attribute value that contains a filtering WHERE clause, the resulting SELECT statement contains a WHERE clause that uses the AND operator to associate the Filter Overrides filter conditions with the SQL Query Override conditions. For example:
SELECT * from schema.table WHERE Filter_Overrides_conditions AND SQL_Query_Override_conditions
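For instance, if the Filter Overrides value is TYPE='A' or TYPE='D' and the SQL Query Override supplies WHERE REGION='EU' (REGION is a hypothetical column), the generated statement would take a form similar to the following, although the exact grouping of the conditions can differ:
SELECT * from schema.table WHERE (TYPE='A' or TYPE='D') AND REGION='EU'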
SQL Query Override
An SQL statement that overrides the default SQL query that the PWX Reader uses to retrieve records from PowerExchange.
You must use the nonrelational SQL syntax that PowerExchange supports. For more information, see the "PowerExchange Nonrelational SQL" chapter in the PowerExchange Reference Manual. If the override statement includes a date or timestamp column, ensure that the range of years in the column data is within the range of years for PowerExchange data checking and filtering, as controlled by the DATERANGE statement in the DBMOVER configuration file. The default range is 1800 to 2200.
For example, to select records for the USER source table when the TYPE column has the value of A or D, specify the following query:
SELECT ID, NAME from schema.data_map.USER where TYPE='A' or TYPE='D';
Maximum length of the SELECT query override statement is 8192 bytes.
Note: If you specify an SQL Query Override attribute value that contains a filtering WHERE clause and also specify the Filter Overrides attribute, the resulting SELECT statement contains a WHERE clause that uses the AND operator to associate the SQL Query Override conditions with the Filter Overrides filter conditions. For example:
SELECT * from schema.table WHERE Filter_Overrides_conditions AND SQL_Query_Override_conditions
Tracing Level
The level of detail that appears in the session log for a Source transformation. Select one of the following options, which are listed in order of increasing detail:
- Terse
- Normal
- Verbose Initialization
- Verbose Data
Default is Normal.
•Fields tab. Select the fields that you want to use in the data flow. You can edit field metadata such as precision and scale if necessary.
Note: Ignore the Partitions tab. Key-range partitioning is not supported for Adabas connections.
Targets in mappings with Adabas sources
To configure a target in a mapping, click the Target transformation box on the mapping canvas. Then configure the target properties in the Target Properties panel below the mapping canvas.
Note: To determine which target types are supported by Adabas Connector, see Supported targets.
•General tab. Specify a user-defined name and optional description for the target.
•Incoming Fields tab. Optionally review the included fields and define field exclusion rules and renaming criteria.
•Target tab. Configure the target properties under Details and under Advanced. The advanced properties are available only if you enter a specific connection for the target. Target properties vary based on the supported target that you select.
•Target Fields tab. Optionally add fields, delete fields, or edit field metadata such as native data type, precision, and scale.
•Field Mapping tab. Map the incoming fields to the target fields. You can manually map an incoming field to a target field or automatically map fields based on the field names. If you entered a connection parameter, you must select Automatic as the field map option to be able to validate the mapping. Alternatively, you can completely or partially parameterize field mappings, and then define the specific field mappings in the mapping task. For more information, see Transformations > Target Transformation > Target Transformation Field Mappings in the Data Integration Help.
Adabas targets in a mapping
To write data to an Adabas database, configure an Adabas object as the Target transformation in a mapping.
Specify the name and description of the Adabas target. Configure the target and advanced properties for the target object.
The following table describes the target properties under Details:
Property
Description
Connection
Select an existing Adabas target connection, or create a target connection that has a type of Adabas.
Alternatively, you can use a connection parameter in the mapping and then enter a specific connection in each mapping task that is associated with the mapping.
If you want to use a connection parameter with a specific target object, you must first select a specific connection so that you can access the target to select the target object in the Object field. After you select the target object, define the connection parameter.
If you want to use parameters for both the connection and target object, you do not need to first select a specific connection. You can just define the parameters in either order.
To define a connection parameter, click New Parameter and enter the following information:
- Enter a parameter name that begins with a letter and is up to 200 characters in length, including @, #, _, and alphanumeric characters.
- In the Type property, verify that connection is selected.
- Leave the Connection Type property blank.
Target Type
Select Single Object if you want to specify a target table.
Object
Click Select. In the Target Object dialog box, select Existing and then select an existing target table.
Operation
Select the target operation. You can perform the following operations on an Adabas target:
- Insert
- Update
- Upsert
- Delete
- Data Driven
The following table describes the advanced target properties that appear under Advanced:
Advanced Property
Description
Forward Rejected Rows
Select this option to have the Target transformation pass rejected records to a reject file.
Note: This field is not displayed if a parameter is specified for the target connection.
Pre SQL
SQL statements that run on the target before the extracted data is written to the target.
Maximum length is 5000 characters.
Post SQL
SQL statements that run on the target after the extracted data is written to the target.
Maximum length is 5000 characters.
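For example, a Post SQL value such as the following hypothetical statement, written in the nonrelational SQL syntax that PowerExchange supports, could delete processed rows from a staging table after the target load completes (the table and column names are placeholders):
DELETE FROM schema.data_map.STAGING WHERE STATUS='PROCESSED'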
Update SQL Override
Overrides the default Update SQL that is sent to PowerExchange.
Map Schema Override
A schema name that overrides the schema name in the PowerExchange data map.
Map Name Override
Overrides the target PowerExchange data map name.
Insert SQL Override
Overrides the default Insert SQL sent to PowerExchange.
Delete SQL Override
Overrides the default Delete SQL that is sent to PowerExchange.
Pre SQL run once per Connection
Runs the SQL that you specify in the Pre SQL attribute only once for a connection.
Select this attribute in either of the following cases:
- In the Pre SQL attribute for a session that uses writer partitioning, you specify an SQL statement such as CREATEFILE that can run only once for the session. If you do not select Pre SQL run once per Connection, the session tries to run the statement once for each partition.
- In the Pre SQL attribute for a session that performs a multiple-record write, you specify a CREATEFILE statement that creates a new generation of a GDG or creates an empty file. If you do not select Pre SQL run once per Connection, the session creates a generation or tries to create a new empty file for each record that the session writes.
Lookup Transformation
You can configure a cached Lookup transformation in a mapping that uses an Adabas connection to return data from an Adabas source based on a specified lookup condition. However, the Adabas Connector does not support uncached lookups.
For more information, see Cloud Data Integration Transformations.