Command | Description |
---|---|
clearConfigurationProperties | Clears overridden property values in the cluster configuration set. |
createConfiguration | Creates a new cluster configuration either from XML files or remote cluster manager. |
deleteConfiguration | Deletes a cluster configuration from the domain. |
exportConfiguration | Exports a cluster configuration to a compressed file or a combined XML file. |
listAssociatedConnections | Lists connections by type that are associated with the specified cluster configuration. |
listConfigurationGroupPermissions | Lists the permissions that a group has for a cluster configuration. |
listConfigurationSets | Lists configuration sets in the cluster configuration. |
listConfigurationProperties | Lists configuration properties in the cluster configuration set. |
listConfigurations | Lists cluster configuration names. |
listConfigurationUserPermissions | Lists the permissions that a user has for a cluster configuration. |
refreshConfiguration | Refreshes a cluster configuration either from XML files or remote cluster manager. |
setConfigurationPermissions | Sets permissions on cluster configuration to a user or a group after removing previous permissions. |
setConfigurationProperties | Sets overridden property values in the cluster configuration set. |
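You can script these commands with the infacmd command line program. The following sketch wraps infacmd in Python and lists the cluster configuration names in a domain. The domain name, credentials, and script name are placeholders, and the -dn, -un, and -pd flags are the standard infacmd connection options; adjust all of them to your environment.

```python
import subprocess

def run_infacmd(plugin: str, command: str, *args: str) -> str:
    """Run an infacmd command and return its console output."""
    cmd = [
        "infacmd.sh", plugin, command,
        "-dn", "MyDomain",        # assumed domain name
        "-un", "Administrator",   # assumed user name
        "-pd", "password",        # assumed password
        *args,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# List the cluster configuration names defined in the domain.
print(run_infacmd("cluster", "listConfigurations"))
```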
Option | Description |
---|---|
ExecutionOptions.MaxHadoopBatchExecutionPoolSize | The maximum number of deployed Hadoop jobs that can run concurrently. |
ExecutionOptions.MaxNativeBatchExecutionPoolSize | The maximum number of deployed native jobs that each Data Integration Service process can run concurrently. |
ExecutionOptions.MaxOnDemandExecutionPoolSize | The maximum number of on-demand jobs that can run concurrently. Jobs include data previews, profiling jobs, REST and SQL queries, web service requests, and mappings run from the Developer tool. |
WorkflowOrchestrationServiceOptions.MaxWorkerThreads | The maximum number of threads that the Data Integration Service can use to run parallel tasks between a pair of inclusive gateways in a workflow. The default value is 10. If the number of tasks between the inclusive gateways is greater than the maximum value, the Data Integration Service runs the tasks in batches that the value specifies. |
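As an illustration, the following sketch sets one of these execution options through infacmd dis UpdateServiceOptions. The service name, the credentials, and the -sn and -o option spellings are assumptions; verify them against the infacmd dis command reference.

```python
import subprocess

# Raise the maximum number of concurrent on-demand jobs to 25 (assumed value).
subprocess.run(
    [
        "infacmd.sh", "dis", "UpdateServiceOptions",
        "-dn", "MyDomain", "-un", "Administrator", "-pd", "password",
        "-sn", "Data_Integration_Service",  # assumed service name
        "-o", "ExecutionOptions.MaxOnDemandExecutionPoolSize=25",
    ],
    check=True,
)
```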
Command | Description |
---|---|
genReuseReportFromPC | Contains the following new option: -BlockSize: Optional. The number of mappings that you want to run the infacmd ipc genReuseReportFromPC command against. |
Command | Description |
---|---|
createConnection | Defines a connection and the connection options. Hadoop connection options are added, changed, and removed in this release. See infacmd isp createConnection. |
getDomainSamlConfig | Renamed from getSamlConfig. Returns the value of the cst option set for Security Assertion Markup Language (SAML) authentication. The cst option specifies the allowed time difference between the Active Directory Federation Services (AD FS) host system clock and the system clock on the master gateway node. |
getUserActivityLog | Returns user activity log data, which now includes successful and unsuccessful user login attempts from Informatica clients. The user activity data includes a set of properties for each login attempt from an Informatica client. If the client sets custom properties on login requests, the data also includes the custom properties. |
listConnections | Lists connection names by type. You can list all connection types or use the new -ct option to filter the results by one connection type. |
purgeLog | Purges log events and database records for license usage. The -lu option is now obsolete. |
SwitchToGatewayNode | Contains new options for configuring SAML authentication. |
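The sketch below illustrates the new -ct filter on infacmd isp listConnections. The connection type value HADOOP, the credentials, and the script name are assumptions for illustration only.

```python
import subprocess

# List only connections of one type by passing the new -ct filter.
result = subprocess.run(
    [
        "infacmd.sh", "isp", "listConnections",
        "-dn", "MyDomain", "-un", "Administrator", "-pd", "password",
        "-ct", "HADOOP",  # assumed connection type token
    ],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```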
Option | Description |
---|---|
clusterConfigId | The cluster configuration ID associated with the Hadoop cluster. |
blazeJobMonitorURL | The host name and port number for the Blaze Job Monitor. |
rejDirOnHadoop | Enables hadoopRejDir. Used to specify a location to move reject files when you run mappings. |
hadoopRejDir | The remote directory where the Data Integration Service moves reject files when you run mappings. Enable the reject directory using rejDirOnHadoop. |
sparkEventLogDir | An optional HDFS file path of the directory that the Spark engine uses to log events. |
sparkYarnQueueName | The YARN scheduler queue name used by the Spark engine that specifies available resources on a cluster. |
Current Name | Previous Name | Description |
---|---|---|
blazeYarnQueueName | cadiAppYarnQueueName | The YARN scheduler queue name used by the Blaze engine that specifies available resources on a cluster. The name is case sensitive. |
blazeExecutionParameterList | cadiExecutionParameterList | Custom properties that are unique to the Blaze engine. |
blazeMaxPort | cadiMaxPort | The maximum value for the port number range for the Blaze engine. |
blazeMinPort | cadiMinPort | The minimum value for the port number range for the Blaze engine. |
blazeUserName | cadiUserName | The owner of the Blaze service and Blaze service logs. |
blazeStagingDirectory | cadiWorkingDirectory | The HDFS file path of the directory that the Blaze engine uses to store temporary files. |
hiveStagingDatabaseName | databaseName | Namespace for Hive staging tables. |
impersonationUserName | hiveUserName | Hadoop impersonation user. The user name that the Data Integration Service impersonates to run mappings in the Hadoop environment. |
sparkStagingDirectory | SparkHDFSStagingDir | The HDFS file path of the directory that the Spark engine uses to store temporary files for running jobs. |
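If you have scripts that still pass the previous option names, a small lookup table such as the following sketch can rewrite them to the current names before you call infacmd isp createConnection. The space-separated name=value options format is an assumption; the name mapping itself comes from the table above.

```python
# Mapping of previous option names to the current names, taken from the table above.
RENAMED_OPTIONS = {
    "cadiAppYarnQueueName": "blazeYarnQueueName",
    "cadiExecutionParameterList": "blazeExecutionParameterList",
    "cadiMaxPort": "blazeMaxPort",
    "cadiMinPort": "blazeMinPort",
    "cadiUserName": "blazeUserName",
    "cadiWorkingDirectory": "blazeStagingDirectory",
    "databaseName": "hiveStagingDatabaseName",
    "hiveUserName": "impersonationUserName",
    "SparkHDFSStagingDir": "sparkStagingDirectory",
}

def rename_options(options: str) -> str:
    """Rewrite space-separated name=value pairs that still use the previous names."""
    pairs = []
    for pair in options.split():
        name, sep, value = pair.partition("=")
        pairs.append(RENAMED_OPTIONS.get(name, name) + sep + value)
    return " ".join(pairs)

print(rename_options("cadiMinPort=12300 cadiMaxPort=12600 hiveUserName=hive"))
# blazeMinPort=12300 blazeMaxPort=12600 impersonationUserName=hive
```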
Option | Description |
---|---|
RMAddress | The service within Hadoop that submits requests for resources or spawns YARN applications. Imported into the cluster configuration as the property yarn.resourcemanager.address. |
defaultFSURI | The URI to access the default Hadoop Distributed File System. Imported into the cluster configuration as the property fs.defaultFS or fs.default.name. |
Option | Description |
---|---|
metastoreDatabaseDriver* | Driver class name for the JDBC data store. |
metastoreDatabasePassword* | The password for the metastore user name. |
metastoreDatabaseURI* | The JDBC connection URI used to access the data store in a local metastore setup. |
metastoreDatabaseUserName* | The metastore database user name. |
metastoreMode* | Controls whether to connect to a remote metastore or a local metastore. |
remoteMetastoreURI* | The metastore URI used to access metadata in a remote metastore setup. This property is imported into the cluster configuration as the property hive.metastore.uris. |
jobMonitoringURL | The URL for the MapReduce JobHistory server. |
* These properties are deprecated in 10.2. When you upgrade to 10.2, the property values you set in a previous release are saved in the repository, but they do not appear in the connection properties. |
Property | Description |
---|---|
ZOOKEEPERHOSTS | Name of the machine that hosts the ZooKeeper server. |
ZOOKEEPERPORT | Port number of the machine that hosts the ZooKeeper server. |
ISKERBEROSENABLED | Enables the Informatica domain to communicate with the HBase master server or region server that uses Kerberos authentication. |
hbaseMasterPrincipal | Service Principal Name (SPN) of the HBase master server. |
hbaseRegionServerPrincipal | Service Principal Name (SPN) of the HBase region server. |
Property | Description |
---|---|
defaultFSURI | The URI to access the default Hadoop Distributed File System. |
jobTrackerURI | The service within Hadoop that submits the MapReduce tasks to specific nodes in the cluster. |
hiveWarehouseDirectoryOnHDFS | The absolute HDFS file path of the default database for the warehouse that is local to the cluster. |
metastoreExecutionMode | Controls whether to connect to a remote metastore or a local metastore. |
metastoreDatabaseURI | The JDBC connection URI used to access the data store in a local metastore setup. |
metastoreDatabaseDriver | Driver class name for the JDBC data store. |
metastoreDatabaseUserName | The metastore database user name. |
metastoreDatabasePassword | The password for the metastore user name. |
remoteMetastoreURI | The metastore URI used to access metadata in a remote metastore setup. This property is imported into the cluster configuration as the property hive.metastore.uris. |
Command | Description |
---|---|
manageGroupPermissionOnProject | Manages permissions on multiple projects for a group. |
manageUserPermissionOnProject | Manages permissions on multiple projects for a user. |
upgradeExportedObjects | Upgrades objects exported to an .xml file from a previous Informatica release to the current metadata format. The command generates an .xml file that contains the upgraded objects. |
Command | Description |
---|---|
GetMappingStatus | Gets the current status of a mapping job by job ID. |
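A minimal sketch of checking a job with GetMappingStatus follows. The Data Integration Service name, the job ID, and the -sn and -j option spellings are assumptions; confirm them in the infacmd ms command reference.

```python
import subprocess

def mapping_status(job_id: str) -> str:
    """Return the GetMappingStatus console output for one deployed mapping job."""
    result = subprocess.run(
        [
            "infacmd.sh", "ms", "GetMappingStatus",
            "-dn", "MyDomain", "-un", "Administrator", "-pd", "password",
            "-sn", "Data_Integration_Service",  # assumed service name
            "-j", job_id,                       # assumed job ID option spelling
        ],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print(mapping_status("jobId_12345"))  # hypothetical job ID
```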
Command | Description |
---|---|
completeTask | Completes a Human task instance that you specify. |
delegateTask | Assigns ownership of a Human task instance to a user or group. |
listTasks | Lists the Human task instances that meet the filter criteria that you specify. |
releaseTask | Releases a Human task instance from the current owner, and returns ownership of the task instance to the business administrator that the workflow configuration identifies. |
startTask | Changes the status of a Human task instance to IN_PROGRESS. |
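The sketch below shows how these commands might be combined to list Human task instances and then complete one. The service name and the -tid option spelling are assumptions; check the infacmd wfs command reference for the actual option names.

```python
import subprocess

# Standard infacmd connection options (assumed values).
BASE = ["-dn", "MyDomain", "-un", "Administrator", "-pd", "password",
        "-sn", "Data_Integration_Service"]  # assumed service name

# List the Human task instances that the filter criteria return.
subprocess.run(["infacmd.sh", "wfs", "listTasks", *BASE], check=True)

# Complete a specific Human task instance (-tid is an assumed option spelling).
subprocess.run(["infacmd.sh", "wfs", "completeTask", *BASE,
                "-tid", "Task_12345"], check=True)
```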
Command | Description |
---|---|
DefineDomain | Contains new options for configuring Security Assertion Markup Language (SAML) authentication. |
DefineGatewayNode | Contains new options for configuring SAML authentication. |
UpdateDomainSamlConfig | Renamed from UpdateSamlConfig. Contains a new option for configuring SAML authentication. |
UpdateGatewayNode | Contains new options for configuring SAML authentication. |
Command | Description |
---|---|
CreateQuery | Creates a query in the repository. |
DeleteQuery | Deletes a query from the repository. |
Command | Description |
---|---|
CreateConnection | Contains the following updated option: -w. Enables you to use a parameter in the password option. |
ListObjectDependencies | Contains the following updated option: -o. The object type list includes query and deploymentgroup. |
UpdateConnection | Contains the following updated options: -w. Enables you to use a parameter in the password option. -x. Disables the use of a parameter in the password option. |