PowerExchange Adapters
This section describes new PowerExchange adapter features in version 10.4.1.
PowerExchange Adapters for Informatica
This section describes new PowerExchange adapter features in version 10.4.1.
PowerExchange for Amazon Redshift
Effective in version 10.4.1, PowerExchange for Amazon Redshift includes the following features:
- The Amazon S3 bucket that you specify to create staging files can be in a different region than the Amazon Redshift cluster.
- You can use KMS customer managed keys from an external account to create encrypted resources by providing the Amazon Resource Name (ARN) of the external account in the connection properties.
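The cross-account key reference works because a full KMS key ARN embeds the ID of the account that owns the key. A minimal sketch, using a made-up ARN rather than a real key:

```python
# Sketch: a KMS customer managed key in another AWS account is referenced by
# its full ARN, which embeds that account's ID. The ARN below is a made-up
# example, not a real key.
def parse_kms_arn(arn: str) -> dict:
    """Split a KMS key ARN into its components.

    Format: arn:aws:kms:<region>:<account-id>:key/<key-id>
    """
    prefix, resource = arn.rsplit(":", 1)
    _, partition, service, region, account = prefix.split(":")
    return {
        "partition": partition,
        "service": service,
        "region": region,
        "account": account,  # the external account that owns the key
        "key_id": resource.split("/", 1)[1],
    }

arn = "arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
print(parse_kms_arn(arn)["account"])  # → 111122223333
```

Because the account ID travels inside the ARN, a single connection property is enough to point at a key outside the connecting account.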
For more information, see the Informatica 10.4.1 PowerExchange for Amazon Redshift User Guide.
PowerExchange for Amazon S3
Effective in version 10.4.1, PowerExchange for Amazon S3 includes the following features:
- You can use a .manifest file to read multiple flat files from Amazon S3 and write the data to a target.
- You can use the AWS ap-east-1 (Hong Kong) region in the native environment and on the Spark engine.
- You can use KMS customer managed keys from an external account to create encrypted resources by providing the Amazon Resource Name (ARN) of the external account in the connection properties.
- You can run a mapping to read an Amazon S3 binary file by using the FileName port in native mode.
- You can read an Amazon S3 binary file from a directory in native mode.
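A manifest file simply enumerates the individual files that one read operation should cover. The sketch below follows the Amazon Redshift COPY manifest convention as an illustration of the idea; the exact schema that PowerExchange for Amazon S3 expects is described in the user guide, and the bucket and key names are made up:

```python
import json

# Sketch: build a manifest that lists the flat files a single read covers.
# Layout follows the Amazon Redshift COPY manifest convention; bucket and
# key names are made-up examples.
def build_manifest(urls: list) -> str:
    entries = [{"url": u, "mandatory": True} for u in urls]
    return json.dumps({"entries": entries}, indent=2)

manifest = build_manifest([
    "s3://example-bucket/orders/part-0001.csv",
    "s3://example-bucket/orders/part-0002.csv",
])
print(manifest)
```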
For more information, see the Informatica 10.4.1 PowerExchange for Amazon S3 User Guide.
PowerExchange for Google BigQuery
Effective in version 10.4.1, PowerExchange for Google BigQuery includes the following features:
- You can configure a cached lookup operation on a Google BigQuery table. You can also enable lookup caching for a lookup operation to increase lookup performance. The Data Integration Service caches the lookup source and runs the query on the rows in the cache.
- You can read data from and write data to a table in a Google BigQuery dataset available in a specific region. Specify the Google BigQuery region in the Region ID connection property.
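The benefit of a cached lookup is that the lookup source is read once and every subsequent probe is answered from memory instead of issuing a query per input row. A minimal sketch of that behavior, with made-up keys and values:

```python
# Sketch of what lookup caching buys: the lookup source is scanned once into
# a cache, and all later probes hit the cache instead of the source.
class CachedLookup:
    def __init__(self, fetch_rows):
        self._fetch_rows = fetch_rows  # stand-in for reading the lookup table
        self._cache = None
        self.source_reads = 0

    def lookup(self, key):
        if self._cache is None:  # first probe builds the cache
            self.source_reads += 1
            self._cache = dict(self._fetch_rows())
        return self._cache.get(key)

def fetch_rows():
    # Simulated lookup source; values are made up.
    return [("C-100", "Berlin"), ("C-200", "Madrid")]

lkp = CachedLookup(fetch_rows)
for _ in range(1000):  # 1000 input rows probe the lookup
    lkp.lookup("C-100")
print(lkp.source_reads)  # → 1 (single source scan despite 1000 probes)
```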
For more information, see the Informatica 10.4.1 PowerExchange for Google BigQuery User Guide.
PowerExchange for Hive
Effective in version 10.4.1, you can use complex data types to read and write hierarchical data from Hive tables in a mapping that runs on the Spark engine.
When you read from and write to Hive tables with hierarchical data, you can perform data preview and schema synchronization on the mapping.
You can use Hive tables with hierarchical data on the HDP 3.1 distribution only.
For more information, see the Informatica 10.4.1 PowerExchange for Hive User Guide.
PowerExchange for Microsoft Azure Data Lake Storage Gen2
Effective in version 10.4.1, PowerExchange for Microsoft Azure Data Lake Storage Gen2 includes the following features:
- You can run mappings to read and write ORC files in the following environments:
  - Native environment
  - Spark engine in the Hadoop environment
  - Databricks
- You can run mappings to read and write JSON files in the native environment.
- You can configure the Azure Government endpoints in mappings in the native environment and on the Spark engine in the Hadoop environment.
- You can configure the authenticated proxy server settings for the Data Integration Service to connect to Microsoft Azure Data Lake Storage Gen2.
For more information, see the Informatica 10.4.1 PowerExchange for Microsoft Azure Data Lake Storage Gen2 User Guide.
PowerExchange for Salesforce
Effective in version 10.4.1, PowerExchange for Salesforce includes the following features:
- You can use version 48.0 of the Salesforce API to create a Salesforce connection and access Salesforce objects.
- You can configure dynamic mappings to include frequent changes to sources, targets, and transformation logic at run time based on parameters and rules that you define.
For more information, see the Informatica 10.4.1 PowerExchange for Salesforce User Guide.
PowerExchange for SAP NetWeaver
Effective in version 10.4.1, when you use the HTTP streaming data transfer mode in SAP Table Reader, you can specify a transfer packet size in MB to read data from SAP tables.
After the upgrade, existing mappings that use the HTTP streaming data transfer mode in SAP Table Reader might show a performance improvement. To further tune performance, specify an appropriate transfer packet size based on your requirements.
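The tuning trade-off is simple arithmetic: a larger packet size means fewer HTTP transfers for the same table. The sizes below are made-up illustrations, not recommendations:

```python
import math

# Back-of-the-envelope for transfer packet sizing: number of HTTP transfers
# needed to stream a table of a given size. Sizes are illustrative only.
def packet_count(table_size_mb: float, packet_size_mb: float) -> int:
    return math.ceil(table_size_mb / packet_size_mb)

print(packet_count(2048, 8))   # 2 GB table, 8 MB packets  → 256 transfers
print(packet_count(2048, 64))  # same table, 64 MB packets → 32 transfers
```

Fewer, larger transfers reduce per-request overhead, but very large packets increase memory use per transfer, so the right value depends on your environment.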
For more information, see the Informatica 10.4.1 PowerExchange for SAP NetWeaver User Guide.
PowerExchange for Snowflake
Effective in version 10.4.1, you can use external tables and materialized views as Snowflake sources and perform all read operations on them.
For more information, see the Informatica 10.4.1 PowerExchange for Snowflake User Guide.
PowerExchange Adapters for PowerCenter
This section describes new PowerExchange adapter features in version 10.4.1.
PowerExchange for Db2 Warehouse
Effective in version 10.4.1, PowerExchange for Db2 Warehouse includes the following features:
- You can read and write data for the following data types in Db2 Warehouse:
  - VarcharForBitData
  - CharForBitData
- You can use PowerExchange for Db2 Warehouse on the Windows platform. You can create Db2 Warehouse connections and sessions in PowerCenter installed on Windows to read from and write to Db2 Warehouse.
- You can use PowerExchange for Db2 Warehouse on the AIX platform. You can create Db2 Warehouse connections and sessions in PowerCenter installed on AIX to read from and write to Db2 Warehouse.
For more information, see the Informatica PowerExchange for Db2 Warehouse 10.4.1 User Guide for PowerCenter.
PowerExchange for Greenplum
Effective in version 10.4.1, you can use PowerExchange for Greenplum on the Windows platform. You can create a Greenplum connection and sessions in PowerCenter installed on Windows to read from and write to Greenplum.
For more information, see the Informatica 10.4.1 PowerExchange for Greenplum User Guide for PowerCenter.
PowerExchange for Google BigQuery
Effective in version 10.4.1, PowerExchange for Google BigQuery includes the following features:
- You can create a pipeline Lookup transformation to perform a lookup on a Google BigQuery table. You can retrieve data from a Google BigQuery lookup definition based on the lookup condition that you specify. A pipeline Lookup transformation has a source qualifier as the lookup source.
- You can create a session to read real-time or changed data from a Change Data Capture (CDC) source and load the data into Google BigQuery. You must select Data Driven in the Treat source rows as property to capture changed data. When a session fails or is stopped, you can resume the extraction of changed data from the point of interruption.
- You can read data from and write data to a table in a Google BigQuery dataset available in a specific region. Specify the Google BigQuery region in the Region ID connection property.
For more information, see the Informatica 10.4.1 PowerExchange for Google BigQuery User Guide for PowerCenter.
PowerExchange for Google Cloud Storage
Effective in version 10.4.1, when you import a Google Cloud Storage target definition, you can remove the header row from a Google Cloud Storage flat file by clearing the Generate Header check box in the Google Cloud Storage file formatting options.
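The effect on the flat file itself is shown in the sketch below: with a header the first row carries column names, without one the file starts directly with data. Column and row values are made up:

```python
import csv
import io

# Sketch of the two flat-file layouts: header row present (Generate Header
# selected) versus absent (Generate Header cleared). Values are made up.
def write_flat_file(rows, header=None):
    buf = io.StringIO()
    writer = csv.writer(buf)
    if header:  # "Generate Header" selected
        writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

rows = [["1001", "shipped"], ["1002", "pending"]]
with_header = write_flat_file(rows, header=["order_id", "status"])
without_header = write_flat_file(rows)  # "Generate Header" cleared
print(without_header.splitlines()[0])   # → 1001,shipped
```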
For more information, see the Informatica 10.4.1 PowerExchange for Google Cloud Storage User Guide for PowerCenter.
PowerExchange for Kafka
Effective in version 10.4.1, PowerExchange for Kafka includes the following features:
- When you write data to a Kafka target, you can configure file-based recovery in a real-time session.
- You can configure additional configuration properties to connect to a Kafka broker over SSL. You can configure these properties in the Additional Security Properties field of the Kafka connection.
- When you specify configuration properties in the Additional Security Properties field, the values that you specify are masked.
- You can configure PLAIN security for a Kafka broker in the Additional Connection Properties or Additional Security Properties field.
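The properties in question are standard Kafka client configuration keys. A sketch of the kinds of key-value pairs involved for an SSL or SASL/PLAIN broker, plus the masking behavior applied when the values are displayed; the paths and credentials are made up:

```python
# Sketch: typical Kafka client security properties for SSL and SASL/PLAIN.
# Property names follow standard Kafka client configuration; the paths and
# credentials below are made-up examples.
SECURITY_PROPERTIES = {
    "security.protocol": "SASL_SSL",
    "ssl.truststore.location": "/etc/kafka/client.truststore.jks",
    "ssl.truststore.password": "changeit",
    "sasl.mechanism": "PLAIN",
    "sasl.jaas.config": (
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="app" password="secret";'
    ),
}

def mask(properties: dict) -> dict:
    """Display-only view: every value is masked, mirroring how values entered
    in the Additional Security Properties field are hidden."""
    return {k: "*" * 8 for k in properties}

print(mask(SECURITY_PROPERTIES)["ssl.truststore.password"])  # → ********
```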
For more information, see the Informatica PowerExchange for Kafka 10.4.1 User Guide for PowerCenter.
PowerExchange for Microsoft Azure SQL Data Warehouse V3
Effective in version 10.4.1, you can read data from or write data to a Microsoft Azure SQL Data Warehouse endpoint that resides in a virtual network (VNet).
For more information, see the Informatica 10.4.1 PowerExchange for Microsoft Azure SQL Data Warehouse V3 User Guide for PowerCenter.
PowerExchange for Salesforce
Effective in version 10.4.1, you can use version 48.0 of Salesforce API to create a Salesforce connection and access Salesforce objects.
For more information, see the Informatica 10.4.1 PowerExchange for Salesforce User Guide for PowerCenter.
PowerExchange for SAP NetWeaver
Effective in version 10.4.1, when you use the HTTP streaming data transfer mode in PowerExchange for SAP Dynamic ABAP Table Extractor, you can specify a transfer packet size in MB to optimize the performance of reading data from SAP tables.
For more information, see the Informatica 10.4.1 PowerExchange for SAP NetWeaver User Guide for PowerCenter.
PowerExchange for Snowflake
Effective in version 10.4.1, PowerExchange for Snowflake includes the following features:
- You can use external tables and materialized views as Snowflake sources and perform all read operations on them.
- You can use the Update Override target session property to override the default update query that the PowerCenter Integration Service generates with an update query that you specify.
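To make the override concrete, the sketch below contrasts the shape of a generated update with an override that restricts which rows and columns are touched. It assumes the PowerCenter convention of referencing target ports with the :TU qualifier; the table and column names are made up:

```python
# Sketch: default generated update vs. an update override. Assumes the
# PowerCenter :TU qualifier for target ports; names are made-up examples.
DEFAULT_UPDATE = (
    "UPDATE ORDERS SET STATUS = :TU.STATUS, AMOUNT = :TU.AMOUNT "
    "WHERE ORDER_ID = :TU.ORDER_ID"
)

# Override example: update only STATUS, and only for orders not yet closed.
UPDATE_OVERRIDE = (
    "UPDATE ORDERS SET STATUS = :TU.STATUS "
    "WHERE ORDER_ID = :TU.ORDER_ID AND STATUS <> 'CLOSED'"
)

print(UPDATE_OVERRIDE)
```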
For more information, see the Informatica 10.4.1 PowerExchange for Snowflake User Guide for PowerCenter.