
Parameter files

A parameter file is a list of user-defined parameters and their associated values.
Use a parameter file to define values that you want to update without having to edit the task. You update the values in the parameter file instead of updating values in a task. The parameter values are applied when the task runs.
You can use a parameter file to define parameter values in the following tasks:
Mapping tasks
Define parameter values for connections in the following transformations:
Define parameter values for objects in the following transformations:
Also, define values for parameters in data filters, expressions, and lookup expressions.
Note: Not all connectors support parameter files. To see if a connector supports runtime override of connections and data objects, see the help for the appropriate connector.
Synchronization tasks
Define values for parameters in data filters, expressions, and lookup expressions.
PowerCenter tasks
Define values for parameters and variables in data filters, expressions, and lookup expressions.
You enter the parameter file name and location when you configure the task.
You can't use a parameter file in a mapping task that is based on a mapping in SQL ELT mode.

Parameter file requirements

You can reuse parameter files across assets such as mapping tasks, taskflows, and linear taskflows. To reuse a parameter file, define local and global parameters within it.
You group parameters in different sections of the parameter file. Each section is preceded by a heading that identifies the project, folder, and asset to which you want to apply the parameter values. You define parameters directly below the heading, entering each parameter on a new line.
The following headings define each section in the parameter file and the scope of the parameters that you define in each section:

#USE_SECTIONS
    Tells Data Integration that the parameter file contains asset-specific parameters. Use this heading as the first line of a parameter file that contains sections. Otherwise, Data Integration reads only the first global section and ignores all other sections.

[Global]
    Defines parameters for all projects, folders, tasks, taskflows, and linear taskflows.

[project name].[folder name].[taskflow name] or [project name].[taskflow name]
    Defines parameters for tasks in the named taskflow only. If a parameter is defined in both a taskflow section and a global section, the value in the taskflow section overrides the global value.

[project name].[folder name].[linear taskflow name] or [project name].[linear taskflow name]
    Defines parameters for tasks in the named linear taskflow only. If a parameter is defined in both a linear taskflow section and a global section, the value in the linear taskflow section overrides the global value.

[project name].[folder name].[task name] or [project name].[task name]
    Defines parameters for the named task only. If a parameter is defined in both a task section and a global section, the value in the task section overrides the global value. If a parameter is defined in both a task section and a taskflow or linear taskflow section, and the taskflow uses the task, the value in the task section overrides the value in the taskflow section.

If the parameter file does not contain sections, Data Integration reads all parameters as global.
Precede the parameter name with two dollar signs, as follows: $$<parameter>. Define parameter values as follows:
$$<parameter>=value
$$<parameter2>=value2
For example, you have the parameters SalesQuota and Region. In the parameter file, you define each parameter in the following format:
$$SalesQuota=1000
$$Region=NW
The parameter value includes any characters after the equals sign (=), including leading or trailing spaces. Parameter values are treated as String values.
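The format described above can be illustrated with a short sketch. This is a hypothetical parser for demonstration only, not a Data Integration component; it shows that a value is everything after the first equals sign and is always kept as a string, including any leading or trailing spaces.

```python
# Hypothetical sketch of parsing "$$name=value" lines and [section]
# headings in a parameter file; not part of Data Integration.
def parse_parameter_file(text):
    sections = {}
    current = "GLOBAL"  # parameters before any heading are treated as global
    for line in text.splitlines():
        if line.startswith("#USE_SECTIONS"):
            continue  # marker line, carries no parameter values itself
        if line.startswith("["):
            current = line  # e.g. "[Global]" or "[Project1].[Folder1].[task]"
            sections.setdefault(current, {})
            continue
        if line.startswith("$$") and "=" in line:
            name, _, value = line.partition("=")
            # The value is everything after the first '=', including
            # leading or trailing spaces, and is always a string.
            sections.setdefault(current, {})[name[2:]] = value
    return sections

sample = "$$SalesQuota=1000\n$$Region=NW"
print(parse_parameter_file(sample))
# {'GLOBAL': {'SalesQuota': '1000', 'Region': 'NW'}}
```

Note that a value such as `$$Region= NW ` would be stored as the string `" NW "`, spaces included, which matches the behavior described above.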

Parameter scope

When you define values for the same parameter in multiple sections in a parameter file, the parameter with the smallest scope takes precedence over parameters with larger scope.
In this case, Data Integration gives precedence to parameter values in the following order:
  1. Values defined in a task section.
  2. Values defined in a taskflow or linear taskflow section.
  3. Values defined in the #USE_SECTIONS section.
  4. Values defined in a global section.
For example, a parameter file contains the following parameter values:
[GLOBAL]
$$connection=ff5
[Project1].[Folder1].[monthly_sales]
$$connection=ff_jd
For the task "monthly_sales" in Folder1 inside Project1, the value for parameter $$connection is "ff_jd." In all other tasks, the value for $$connection is "ff5."
If you define a parameter in a task section and in a taskflow or linear taskflow section and the taskflow uses the task, Data Integration uses the parameter value defined in the task section.
For example, you define the following parameter values in a parameter file:
#USE_SECTIONS
$$source=customer_table
[GLOBAL]
$$location=USA
$$sourceConnection=Oracle
[Default].[Sales].[Task1]
$$source=Leads_table
[Default].[Sales].[Taskflow2]
$$source=Revenue
$$sourceconnection=ODBC_1
[Default].[Taskflow3]
$$source=Revenue
$$sourceconnection=Oracle_DB
Task1 contains the $$location, $$source, and $$sourceconnection parameters. Taskflow2 and Taskflow3 contain Task1.
When you run Taskflow2, Data Integration uses the following parameter values:
Parameter            Section                          Value
$$source             [Default].[Sales].[Task1]        Leads_table
$$sourceconnection   [Default].[Sales].[Taskflow2]    ODBC_1
$$location           [GLOBAL]                         USA
When you run Taskflow3, Data Integration uses the following parameter values:
Parameter            Section                          Value
$$source             [Default].[Sales].[Task1]        Leads_table
$$sourceconnection   [Default].[Taskflow3]            Oracle_DB
$$location           [GLOBAL]                         USA
When you run Task1, Data Integration uses the following parameter values:
Parameter            Section                          Value
$$source             [Default].[Sales].[Task1]        Leads_table
$$sourceconnection   [GLOBAL]                         Oracle
$$location           [GLOBAL]                         USA
For all other tasks that contain the $$source parameter, Data Integration uses the value customer_table.
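The precedence rules in the examples above can be sketched in a few lines of Python. The function and the section-key layout are illustrative only, not a Data Integration API; for simplicity the sketch treats parameter names as case-sensitive lowercase.

```python
# Illustrative sketch of the precedence order described above:
# task section, then taskflow (or linear taskflow) section,
# then #USE_SECTIONS, then the global section.
def resolve(param, sections, task_section=None, taskflow_section=None):
    for key in (task_section, taskflow_section, "#USE_SECTIONS", "[GLOBAL]"):
        if key and param in sections.get(key, {}):
            return sections[key][param]
    return None  # parameter not defined anywhere in the file

sections = {
    "#USE_SECTIONS": {"source": "customer_table"},
    "[GLOBAL]": {"location": "USA", "sourceconnection": "Oracle"},
    "[Default].[Sales].[Task1]": {"source": "Leads_table"},
    "[Default].[Sales].[Taskflow2]": {"source": "Revenue",
                                      "sourceconnection": "ODBC_1"},
}

# Task1 running inside Taskflow2, matching the tables above:
print(resolve("source", sections,
              "[Default].[Sales].[Task1]", "[Default].[Sales].[Taskflow2]"))
# Leads_table
print(resolve("sourceconnection", sections,
              "[Default].[Sales].[Task1]", "[Default].[Sales].[Taskflow2]"))
# ODBC_1
print(resolve("location", sections,
              "[Default].[Sales].[Task1]", "[Default].[Sales].[Taskflow2]"))
# USA
```

Calling `resolve("source", sections)` with no task or taskflow section falls through to the #USE_SECTIONS value, customer_table, which matches the behavior described for all other tasks.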

Sample parameter file

The following example shows a sample parameter file entry:
#USE_SECTIONS
$$oracleConn=Oracle_SK
$$city=SF
[Global]
$$ff_conn=FF_ja_con
$$st=CA
[Default].[Accounts].[April]
$$QParam=SELECT * from con.ACCOUNT where city=LAX
$$city=LAX
$$tarOb=accounts.csv
$$oracleConn=Oracle_Src

Parameter file location

When you use a parameter file, save the parameter file on a local machine or in a cloud-hosted directory based on the task type. You enter details about the parameter file on the Runtime Options tab when you create the task.
The following table lists the default parameter file directory for each task type:
Mapping task based on a mapping
    <Secure Agent installation directory>/apps/Data_Integration_Server/data/userparameters
Mapping task based on a mapping in advanced mode
    <Secure Agent installation directory>/apps/data/userparameters
Synchronization task
    <Secure Agent installation directory>/apps/Data_Integration_Server/data/userparameters
For mapping tasks, you can also save the parameter file in one of the following locations:
A local machine
Save the file in a location that the Secure Agent can access. For mappings in advanced mode, the file must be stored on the Secure Agent machine.
You enter the file name and directory on the Runtime Options tab when you create the task. Enter the absolute file path. Alternatively, enter a path relative to a $PM system variable, for example, $PMRootDir/ParameterFiles.
The following table lists the system variables that you can use:
$PMRootDir
    Root directory for the Data Integration Server Secure Agent service. Default is <Secure Agent installation directory>/apps/Data_Integration_Server/data.
$PMBadFileDir
    Directory for row error logs and reject files. Default is $PMRootDir/error.
$PMCacheDir
    Directory for index and data cache files. Default is $PMRootDir/cache.
$PMExtProcDir
    Directory for external procedures. Default is $PMRootDir.
$PMLookupFileDir
    Directory for lookup files. Default is $PMRootDir.
$PMSessionLogDir
    Directory for session logs. Default is $PMRootDir/../logs.
$PMSourceFileDir
    Directory for source files. Default is $PMRootDir.
$PMStorageDir
    Directory for files related to the state of operation of internal processes, such as session and workflow recovery files. Default is $PMRootDir.
$PMTargetFileDir
    Directory for target files. Default is $PMRootDir.
$PMTempDir
    Directory for temporary files. Default is $PMRootDir/temp.
$PMWorkflowLogDir
    Directory for workflow logs. Default is $PMRootDir/../logs.
To find the configured path of a system variable, see the pmrdtm.cfg file located at the following directory:
<Secure Agent installation directory>\apps\Data_Integration_Server\<Data Integration Server version>\ICS\main\bin\rdtm
You can also find the configured path of any variable except $PMRootDir in the Data Integration Server system configuration details in Administrator.
If you do not enter a location, Data Integration uses the default parameter file directory.
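A relative path such as $PMRootDir/ParameterFiles resolves against the configured value of the system variable. The following is a minimal sketch of that expansion; the configured value shown is an assumption for illustration, since the real values come from the Secure Agent's pmrdtm.cfg file.

```python
# Sketch of expanding a $PM system variable at the start of a
# parameter file path. The configured value below is hypothetical.
PM_VARS = {
    "$PMRootDir": "/opt/agent/apps/Data_Integration_Server/data",
}

def expand(path, variables=PM_VARS):
    for name, value in variables.items():
        if path.startswith(name):
            # Replace the variable prefix with its configured directory.
            return value + path[len(name):]
    return path  # absolute paths pass through unchanged

print(expand("$PMRootDir/ParameterFiles"))
# /opt/agent/apps/Data_Integration_Server/data/ParameterFiles
```

An absolute path such as /data/files/my_task.param is used as-is, because it does not begin with a $PM variable.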
A cloud platform
You can use a connection stored with Informatica Intelligent Cloud Services. The following table shows the connection types that you can use and the configuration requirements for each connection type:
Amazon S3 V2
    You can use a connection that was created with the following credentials:
    - Access Key
    - Secret Key
    - Region
    The S3 bucket must be public.
Azure Data Lake Store Gen2
    You can use a connection that was created with the following credentials:
    - Account Name
    - ClientID
    - Client Secret
    - Tenant ID
    - File System Name
    - Directory Path
    The storage point must be public.
Google Storage V2
    You can use a connection that was created with the following credentials:
    - Service Account ID
    - Service Account Key
    - Project ID
    The storage bucket must be public.
Create the connection before you configure the task. You select the connection and file object to use on the Runtime Options tab when you create the task.
Data Integration displays the location of the parameter file and the value of each parameter in the job details after you run the task.

Rules and guidelines for parameter files

Data Integration uses the following rules to process parameter files:

Parameter file templates

You can generate and download a parameter file template that contains mapping parameters and their default values. The parameter file template includes input and in-out parameters that can be overridden at runtime. Save the parameter file template and use it to apply parameter values when you run the task, or copy the mapping parameters to another parameter file.
When you generate a parameter file template, the file contains the default parameter values from the mapping on which the task is based. If you do not specify a default value when you create the parameter, the value for the parameter in the template is blank.
The parameter file template does not contain the following elements:
If you add, edit, or delete parameters in the mapping, download a new parameter file template.

Downloading a parameter file template

    1. On the Runtime Options tab in the mapping task, click Download Parameter File Template.
       The file name is <mapping task name>.param.
    2. If you want to use the file in subsequent task runs, save the parameter file in a location that is accessible by the Secure Agent.
       Enter the file name and directory on the Runtime Options tab when you configure the task.

Overriding connections with parameter files

If you use a connection parameter in a mapping, you can override the connection defined in the mapping task at runtime with values specified in a parameter file.
When you define a connection value in a parameter file, the connection type must be the same as the default connection type in the mapping task. For example, you create a Flat File connection parameter and use it as the source connection in a mapping. In the mapping task, you provide a flat file default connection. In the parameter file, you can only override the connection with another flat file connection.
When you override an FTP connection with a parameter, the local directory for the file must be the same.
You cannot use a parameter file to override a lookup with an FTP/SFTP connection.
Note: Some connectors support only cached lookups. To see which type of lookup a connector supports, see the help for the appropriate connector.
    1. In the mapping, create an input parameter:
       a. Select connection for the parameter type.
       b. Select Allow parameter to be overridden at runtime.
    2. In the mapping, use the parameter as the connection that you want to override.
    3. In the mapping task, define the parameter details:
       a. Select a default connection.
       b. On the Runtime Options tab, enter the parameter file directory and parameter file name.
    4. In the parameter file, define the connection parameter with the value that you want to use at runtime.
       Precede the parameter name with two dollar signs ($$). For example, you have a parameter with the name ConParam and you want to override it with the connection OracleCon1. You define the runtime value in the following format:
       $$ConParam=OracleCon1
    5. If you want to change the connection, update the parameter value in the parameter file.

Overriding data objects with parameter files

If you use a data object parameter in a mapping, you can override the object defined in the mapping task at runtime with values specified in a parameter file.
Note: You cannot override source objects when you read from multiple relational objects or from a file list. You cannot override target objects if you create a target at run time.
When you define an object parameter in the parameter file, the parameter in the file must have the same metadata as the default parameter in the mapping task. For example, if you override the source object ACCOUNT with EMEA_ACCOUNT, both objects must contain the same fields and the same data types for each field.
    1. In the mapping, create an input parameter:
       a. Select data object for the parameter type.
       b. Select Allow parameter to be overridden at runtime.
    2. In the mapping, use the object parameter as the object that you want to override.
    3. In the mapping task, define the parameter details:
       a. Set the type to Single.
       b. Select a default data object.
       c. On the Runtime Options tab, enter the parameter file directory and file name.
    4. In the parameter file, specify the object to use at runtime. Perform one of the following tasks:
    5. If you want to change the object, update the parameter value in the parameter file.

Overriding source queries

If you use a source query or filter condition in a mapping, you can override the value specified in the mapping task with values specified in a parameter file. You can override source queries for relational and ODBC database connections.
When you define an SQL query, the fields in the overridden query must be the same as the fields in the default query. The task fails if the query in the parameter file contains fewer fields or is invalid.
If a filter condition parameter is not resolved in the parameter file, Data Integration uses the parameter name as the literal filter value, and the task returns zero rows.
    1. In the mapping, create a data object parameter.
    2. Select Allow parameter to be overridden at runtime.
    3. Use the parameter as the source object.
    4. In the mapping task, on the Sources tab, select Query as the source type.
    5. Enter a default custom query.
    6. On the Runtime Options tab, provide the parameter file name and location.
    7. In the parameter file, enter the values to use when the task runs.
    8. If you want to change the query, update the parameter value in the parameter file.

Creating target objects at run time with parameter files

If you use a target object parameter in a mapping, you can create a target at run time using a parameter file.
You include the target object parameter and the name that you want to use in a parameter file. If the target name in the parameter file doesn't exist, Data Integration creates the target at run time. In subsequent runs, Data Integration uses the existing target.
To create a target at run time using a parameter file, the following conditions must be true:
For file storage-based connections, the parameter value in the parameter file can include both the path and file name. If the path is not specified, the target is created in the default path specified in the connection.
    1. In the mapping, create an input parameter:
       a. Select data object for the parameter type.
       b. Select Allow parameter to be overridden at runtime.
    2. In the mapping, use the parameter as the target object.
    3. In the task, define the parameter details:
       a. Set the type to Single.
       b. Select a default data object.
       c. On the Runtime Options tab, enter the parameter file directory and file name.
    4. In the parameter file, specify the name of the target object that you want to create.
       Precede the parameter name with two dollar signs ($$). For example, you have a parameter with the name TargetObjParam and you want to create a target object with the name MyTarget. You define the runtime value in the following format:
       $$TargetObjParam=MyTarget
       For file storage-based connector types, include the path in the object name. If you don't include a path, the target is created in the default path specified in the connection.
    5. If you want to change the object, update the parameter value in the parameter file.