PowerExchange Adapters
This section describes new PowerExchange adapter features in version 10.4.0.
PowerExchange Adapters for Informatica
This section describes new Informatica adapter features in version 10.4.0.
PowerExchange for Amazon Redshift
Effective in version 10.4.0, PowerExchange for Amazon Redshift includes the following features:
- You can run mappings in the AWS Databricks environment.
- You can select a cluster region name in the Cluster Region connection property, even if you specify the cluster region name in the JDBC URL connection property.
- You can retain null values when you read data from Amazon Redshift.
- You can specify the number of staging files per batch when you write data to Amazon Redshift.
- You can preserve the record order when you write data from a CDC source to an Amazon Redshift target.
For more information, see the Informatica 10.4.0 PowerExchange for Amazon Redshift User Guide.
PowerExchange for Amazon S3
Effective in version 10.4.0, PowerExchange for Amazon S3 includes the following features:
- You can run mappings in the AWS Databricks environment.
- You can use AssumeRole to obtain temporary security credentials to access AWS resources.
- You can parameterize the data format type and schema in the read and write operation properties at run time.
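Outside of Informatica, the AssumeRole flow returns a temporary credential triple (access key, secret key, session token) that a client then passes to the storage service. A minimal sketch of that mapping, using the response shape of the AWS STS AssumeRole call; the values below are placeholders, not product defaults:

```python
# Sketch: how AssumeRole-style temporary credentials map onto client settings.
# The response shape mirrors the AWS STS AssumeRole API; values are placeholders.

def credentials_from_assume_role(response):
    """Extract the temporary credential triple from an STS AssumeRole response."""
    creds = response["Credentials"]
    return {
        "aws_access_key_id": creds["AccessKeyId"],
        "aws_secret_access_key": creds["SecretAccessKey"],
        "aws_session_token": creds["SessionToken"],  # marks the keys as temporary
    }

# Example response shaped like what sts.assume_role(...) returns:
sample = {
    "Credentials": {
        "AccessKeyId": "ASIAEXAMPLEKEY",
        "SecretAccessKey": "example-secret",
        "SessionToken": "example-token",
        "Expiration": "2019-12-31T23:59:59Z",
    }
}

print(credentials_from_assume_role(sample)["aws_session_token"])  # example-token
```

The session token is what distinguishes temporary AssumeRole credentials from long-lived access keys; it must accompany every request made with them.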
For more information, see the Informatica 10.4.0 PowerExchange for Amazon S3 User Guide.
PowerExchange for Google BigQuery
Effective in version 10.4.0, PowerExchange for Google BigQuery includes the following features:
- You can use a Google Dataproc cluster to run mappings on the Spark engine.
- You can increase mapping performance by running the mapping in Optimized Spark mode. You can specify whether to run the mapping in Generic or Optimized mode in the advanced read and write operation properties. When you use Optimized Spark mode to read data, you can specify the number of partitions to use.
- You can configure a SQL override to override the default SQL query used to extract data from the Google BigQuery source.
- You can read or write data of the NUMERIC data type to Google BigQuery. The NUMERIC data type is an exact numeric value with 38 digits of precision and 9 decimal digits of scale. When you read or write the NUMERIC data type, the Data Integration Service maps the NUMERIC data type to the Decimal transformation data type with a precision of up to 38 and a scale of up to 9.
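The precision and scale limits above can be checked with Python's decimal module. This is a generic illustration of a 38-digit, scale-9 NUMERIC value, not Informatica code:

```python
from decimal import Decimal, getcontext

# NUMERIC in Google BigQuery is an exact type: 38 digits of precision, scale 9.
getcontext().prec = 38

# 29 integer digits + 9 fractional digits = 38 significant digits
value = Decimal("12345678901234567890123456789.123456789")

digits = len(value.as_tuple().digits)   # total significant digits
scale = -value.as_tuple().exponent      # digits after the decimal point
print(digits, scale)  # 38 9
```

Because the type is exact rather than floating point, a value at these limits round-trips without loss, which is why it maps onto the Decimal transformation data type rather than Double.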
For more information, see the Informatica 10.4.0 PowerExchange for Google BigQuery User Guide.
PowerExchange for Google Cloud Storage
Effective in version 10.4.0, PowerExchange for Google Cloud Storage includes new features.
For more information, see the Informatica 10.4.0 PowerExchange for Google Cloud Storage User Guide.
PowerExchange for Microsoft Azure Blob Storage
Effective in version 10.4.0, PowerExchange for Microsoft Azure Blob Storage includes the following features:
- You can parameterize the data format type and schema in the read and write operation properties at run time.
- You can use shared access signatures authentication when you create a Microsoft Azure Blob Storage connection.
For more information, see the Informatica 10.4.0 PowerExchange for Microsoft Azure Blob Storage User Guide.
PowerExchange for Microsoft Azure SQL Data Warehouse
Effective in version 10.4.0, you can read data from or write data to a Microsoft Azure SQL Data Warehouse endpoint that resides in a virtual network (VNet).
For more information, see the Informatica 10.4.0 PowerExchange for Microsoft Azure SQL Data Warehouse User Guide.
PowerExchange for Salesforce
Effective in version 10.4.0, PowerExchange for Salesforce includes the following features:
- You can use versions 45.0, 46.0, and 47.0 of the Salesforce API to create a Salesforce connection and access Salesforce objects.
- You can enable primary key chunking for queries on a shared object that represents a sharing entry on the parent object. Primary key chunking is supported for shared objects only if the parent object is supported. For example, if you want to query CaseHistory, primary key chunking must be supported for the parent object Case.
- You can create assignment rules to reassign attributes in records when you insert, update, or upsert records for Lead and Case target objects using the standard API.
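For context, in the Salesforce Bulk API itself, primary key chunking is requested with the Sforce-Enable-PKChunking request header, and the parent option names the parent object for shared objects such as CaseHistory. A hedged sketch of building that header; the chunk size is an arbitrary example, not a product default:

```python
# Sketch: the Salesforce Bulk API header that enables primary key chunking.
# The chunk size below is an illustrative value, not a product default.

def pk_chunking_header(chunk_size=100_000, parent=None):
    """Build the Sforce-Enable-PKChunking header for a Bulk API job request."""
    value = f"chunkSize={chunk_size}"
    if parent:
        # For shared objects (e.g. CaseHistory), chunking runs on the parent (Case).
        value += f"; parent={parent}"
    return {"Sforce-Enable-PKChunking": value}

print(pk_chunking_header(parent="Case"))
# {'Sforce-Enable-PKChunking': 'chunkSize=100000; parent=Case'}
```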
For more information, see the Informatica 10.4.0 PowerExchange for Salesforce User Guide.
PowerExchange for SAP NetWeaver
Effective in version 10.4.0, PowerExchange for SAP NetWeaver includes the following features:
- You can configure HTTPS streaming for SAP table reader mappings.
- You can read data from ABAP CDS views using SAP Table Reader if your SAP NetWeaver system version is 7.50 or later.
- You can read data from SAP tables with fields that have the following data types:
  - DF16_DEC
  - DF16_RAW
  - DF34_DEC
  - DF34_RAW
  - INT8
  - RAWSTRING
  - SSTRING
  - STRING
For more information, see the Informatica PowerExchange for SAP NetWeaver 10.4.0 User Guide.
PowerExchange for Snowflake
Effective in version 10.4.0, PowerExchange for Snowflake includes the following features:
- You can run Snowflake mappings in the Databricks environment.
- You can use Snowflake objects as dynamic sources and targets in a mapping.
- You can create a Snowflake target using the Create Target option.
- You can configure a target schema strategy for a Snowflake target in a mapping. You can choose to retain the existing target schema or to create a target if it does not exist. You can also specify the target schema strategy options as a parameter value.
- You can specify a rejected file name and path in the Snowflake advanced target properties. The Data Integration Service uses this file to write records that are rejected while writing to the target.
- When the ODBC provider type in the Snowflake ODBC connection is Snowflake, you can configure pushdown optimization to push the transformation logic to the Snowflake database.
- You can read or write data of the Decimal data type with a precision of up to 38 digits by configuring the EnableSDKDecimal38 custom flag in the Data Integration Service properties.
For more information, see the Informatica 10.4.0 PowerExchange for Snowflake User Guide.
PowerExchange for HDFS
Effective in version 10.4.0, PowerExchange for HDFS includes the following features:
- You can parameterize the data format type and schema in the read and write operation properties at run time.
- You can format the schema of a complex file data object for a read or write operation.
For more information, see the Informatica 10.4.0 PowerExchange for HDFS User Guide.
PowerExchange Adapters for PowerCenter
This section describes new PowerCenter adapter features in version 10.4.0.
PowerExchange for Google BigQuery
Effective in version 10.4.0, you can read or write data of the NUMERIC data type to Google BigQuery. The NUMERIC data type is an exact numeric value with 38 digits of precision and 9 decimal digits of scale. When you read or write the NUMERIC data type, the PowerCenter Integration Service maps the NUMERIC data type to the Decimal transformation data type with a precision of up to 28 and a scale of up to 9.
For more information, see the Informatica 10.4.0 PowerExchange for Google BigQuery User Guide for PowerCenter.
PowerExchange for Google Cloud Storage
Effective in version 10.4.0, you can configure the following Google Cloud Storage data object read operation advanced properties when you read data from a Google Cloud Storage source:
- Google Cloud Storage Path. Overrides the Google Cloud Storage path to the file that you selected in the Google Cloud Storage data object. Use the following format: gs://<bucket name> or gs://<bucket name>/<folder name>
- Source File Name. Overrides the Google Cloud Storage source file name specified in the Google Cloud Storage data object.
- Is Directory. Reads all the files available in the folder specified in the Google Cloud Storage Path data object read operation advanced property.
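The gs://<bucket name> or gs://<bucket name>/<folder name> format splits into a bucket and an optional folder. A small illustrative parser, not part of the product; the bucket and folder names are hypothetical:

```python
# Sketch: splitting a Google Cloud Storage path of the form
# gs://<bucket name> or gs://<bucket name>/<folder name> into its parts.

def parse_gcs_path(path):
    """Return (bucket, folder) for a gs:// path; folder is None if absent."""
    prefix = "gs://"
    if not path.startswith(prefix):
        raise ValueError("Google Cloud Storage paths must start with gs://")
    bucket, _, folder = path[len(prefix):].partition("/")
    return bucket, folder or None

print(parse_gcs_path("gs://my-bucket/landing"))  # ('my-bucket', 'landing')
print(parse_gcs_path("gs://my-bucket"))          # ('my-bucket', None)
```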
For more information, see the Informatica 10.4.0 PowerExchange for Google Cloud Storage User Guide for PowerCenter.
PowerExchange for Greenplum
Effective in version 10.4.0, you can use PowerExchange for Greenplum to read data from Greenplum. You can configure specific session properties for Greenplum sources to determine how to extract data from Greenplum.
When you run a Greenplum session to read data, the PowerCenter Integration Service invokes gpfdist, the Greenplum parallel file distribution program, to read the data.
For more information, see the Informatica 10.4.0 PowerExchange for Greenplum User Guide for PowerCenter.
PowerExchange for JD Edwards EnterpriseOne
Effective in version 10.4.0, you can use version 9.2 of the JD Edwards EnterpriseOne API to create a JD Edwards EnterpriseOne connection and access JD Edwards EnterpriseOne objects.
For more information, see the Informatica 10.4.0 PowerExchange for JD Edwards EnterpriseOne User Guide for PowerCenter.
PowerExchange for Kafka
Effective in version 10.4.0, you can configure the following SSL properties to enable a secure connection to a Kafka broker:
- SSL Mode
- SSL TrustStore File Path
- SSL TrustStore Password
- SSL KeyStore File Path
- SSL KeyStore Password
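These property names mirror the SSL settings of the standard Java Kafka client. A sketch of how they correspond to client configuration keys; the file paths and passwords are placeholders, not values shipped with the product:

```python
# Sketch: Java-Kafka-client-style SSL settings corresponding to the
# properties above. Paths and passwords are placeholders.
ssl_config = {
    "security.protocol": "SSL",                                    # SSL Mode
    "ssl.truststore.location": "/etc/kafka/client.truststore.jks", # TrustStore File Path
    "ssl.truststore.password": "truststore-secret",                # TrustStore Password
    "ssl.keystore.location": "/etc/kafka/client.keystore.jks",     # KeyStore File Path
    "ssl.keystore.password": "keystore-secret",                    # KeyStore Password
}

print(sorted(ssl_config))
```

The truststore verifies the broker's certificate; the keystore is only needed when the broker also requires client certificate authentication.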
You can configure the Kafka messaging broker to use Kafka broker version 0.10.1.1 or later.
For more information, see the Informatica PowerExchange for Kafka 10.4.0 User Guide for PowerCenter.
PowerExchange for Salesforce
Effective in version 10.4.0, you can use versions 46.0 and 47.0 of the Salesforce API to create a Salesforce connection and access Salesforce objects.
For more information, see the Informatica 10.4.0 PowerExchange for Salesforce User Guide for PowerCenter.
PowerExchange for SAP NetWeaver
Effective in version 10.4.0, you can use PowerExchange for SAP Dynamic ABAP Table Extractor to read data from SAP tables and ABAP Core Data Services (CDS) views through HTTP/HTTPS streaming. You can read data from ABAP CDS views using PowerExchange for SAP Dynamic ABAP Table Extractor if your SAP NetWeaver system version is 7.50 or later.
For more information, see the Informatica PowerExchange for SAP NetWeaver 10.4.0 User Guide.