PowerExchange Adapters
This section describes new PowerExchange adapter features in version 9.6.1.
Informatica Adapters
This section describes new Informatica adapter features.
- PowerExchange for DataSift
- You can extract historical data from DataSift for Twitter sources.
- For more information, see the Informatica PowerExchange for DataSift 9.6.1 User Guide.
- PowerExchange for Greenplum
- You can use PowerExchange for Greenplum to load large volumes of data into Greenplum tables. You can run mappings developed in the Developer tool in the native or Hive run-time environment.
- You can also use PowerExchange for Greenplum to load data to a HAWQ database in bulk (see the bulk-load sketch after this entry).
- For more information, see the Informatica PowerExchange for Greenplum 9.6.1 User Guide.
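The following sketch only illustrates the kind of bulk load involved: it streams a CSV file into a Greenplum (or HAWQ) table over the PostgreSQL COPY protocol with psycopg2. The host, database, table, and column names are placeholders, and the snippet shows a generic bulk-load pattern rather than the adapter's own mechanism.

```python
# Minimal bulk-load sketch, assuming a reachable Greenplum or HAWQ master and
# an existing public.orders table. All connection details are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="gp-master.example.com",  # assumed master host
    port=5432,
    dbname="analytics",
    user="gpadmin",
    password="secret",
)

with conn, conn.cursor() as cur, open("orders.csv") as data:
    # COPY ... FROM STDIN streams the whole file in a single batch, which is
    # the general idea behind loading large volumes into Greenplum tables.
    cur.copy_expert(
        "COPY public.orders (order_id, customer_id, amount) FROM STDIN WITH CSV",
        data,
    )
conn.close()
```

Because HAWQ also speaks the PostgreSQL protocol, the same COPY pattern applies to a HAWQ database.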
- PowerExchange for LinkedIn
- You can extract information about a group, posts in a group, comments on a group post, and comments on specific posts from LinkedIn. You can also extract the list of groups suggested for a user and the list of groups in which the user is a member.
- For more information, see the Informatica PowerExchange for LinkedIn 9.6.1 User Guide.
- PowerExchange for HBase
- You can use PowerExchange for HBase to read data in parallel from HBase. The Data Integration Service creates multiple Map jobs to read the data in parallel (see the sketch after this entry).
- For more information, see the Informatica PowerExchange for HBase 9.6.1 User Guide.
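As a rough illustration of parallel reads, the sketch below splits an HBase scan into row-key ranges and reads each range on its own thread through the HBase Thrift gateway with happybase. The host, table name, and split points are assumptions, and the snippet only mirrors the idea of the Map jobs described above, not the adapter's implementation.

```python
# Illustrative parallel read: one scan per assumed row-key range.
from concurrent.futures import ThreadPoolExecutor

import happybase

SPLITS = [(None, b"g"), (b"g", b"n"), (b"n", None)]  # hypothetical key ranges


def scan_range(row_start, row_stop):
    # Each worker opens its own Thrift connection and scans one key range.
    conn = happybase.Connection("hbase-thrift.example.com")
    table = conn.table("customers")
    rows = list(table.scan(row_start=row_start, row_stop=row_stop))
    conn.close()
    return rows


with ThreadPoolExecutor(max_workers=len(SPLITS)) as pool:
    for chunk in pool.map(lambda split: scan_range(*split), SPLITS):
        print(len(chunk), "rows read from one range")
```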
- PowerExchange for Hive
- You can create a Hive connection that connects to HiveServer or HiveServer2. Previously, a Hive connection could connect only to HiveServer. HiveServer2 supports Kerberos authentication and concurrent connections (see the connection sketch after this entry).
- For more information, see the Informatica PowerExchange for Hive 9.6.1 User Guide.
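The Hive connection itself is defined through connection properties rather than code, but the sketch below shows what an equivalent HiveServer2 connection with Kerberos looks like from a PyHive client. The host, port, service name, and table are placeholders, and a valid Kerberos ticket (kinit) is assumed.

```python
from pyhive import hive  # Kerberos support needs the sasl/thrift_sasl extras

# Connect to HiveServer2 with Kerberos; placeholders throughout.
conn = hive.Connection(
    host="hiveserver2.example.com",
    port=10000,                      # default HiveServer2 port
    auth="KERBEROS",
    kerberos_service_name="hive",    # service part of the Hive principal
    database="default",
)

cur = conn.cursor()
cur.execute("SELECT COUNT(*) FROM web_logs")  # hypothetical table
print(cur.fetchall())
conn.close()
```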
- PowerExchange for MongoDB
- You can use the Schema Editor to change the schema of MongoDB collections. You can also use virtual tables for MongoDB collections that have nested columns (see the sketch after this entry).
- For more information, see the Informatica PowerExchange for MongoDB 9.6.1 User Guide.
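The sketch below illustrates the virtual-table idea for a collection with a nested array: the document is split into a parent row and child rows joined on the parent _id. The field and table names are invented for the example; the Schema Editor derives the actual layout from your collections.

```python
# Conceptual flattening of one MongoDB document into virtual tables.
order = {
    "_id": 1001,
    "customer": "Acme Corp",
    "items": [                      # nested array -> virtual child table
        {"sku": "A-1", "qty": 2},
        {"sku": "B-7", "qty": 5},
    ],
}

# Parent "orders" virtual table keeps the top-level scalar fields.
orders_row = {k: v for k, v in order.items() if not isinstance(v, list)}

# Child "orders_items" virtual table gets one row per array element,
# keyed back to the parent document.
items_rows = [{"orders_id": order["_id"], **item} for item in order["items"]]

print(orders_row)   # {'_id': 1001, 'customer': 'Acme Corp'}
print(items_rows)   # [{'orders_id': 1001, 'sku': 'A-1', 'qty': 2}, ...]
```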
- PowerExchange for Teradata Parallel Transporter API
- When you load data to a Teradata table in a Hive run-time environment, you can use the Teradata Connector for Hadoop (TDCH) to increase performance. To use TDCH to load data, add the EnableTdch custom property at the Data Integration Service level and set its value to true.
- For more information, see the Informatica PowerExchange for Teradata Parallel Transporter API 9.6.1 User Guide.
PowerCenter Adapters
This section describes new PowerCenter adapter features.
- PowerExchange for LDAP
- In the session properties, you can specify the path and name of a file that contains multiple filter conditions for querying LDAP entries (see the filter sketch after this entry).
- For more information, see the Informatica PowerExchange for LDAP 9.6.1 User Guide for PowerCenter.
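The layout of the filter file itself is described in the guide; the sketch below only shows the kind of filter conditions such a file would express, run here with the ldap3 client. The server, bind DN, search base, and attribute names are placeholders.

```python
from ldap3 import ALL, Connection, Server

server = Server("ldap.example.com", get_info=ALL)
conn = Connection(server, user="cn=reader,dc=example,dc=com",
                  password="secret", auto_bind=True)

# Two standard RFC 4515 filter conditions, evaluated one after the other.
filters = [
    "(&(objectClass=person)(departmentNumber=1200))",
    "(&(objectClass=person)(l=Berlin)(title=*Engineer*))",
]

for flt in filters:
    conn.search(search_base="ou=people,dc=example,dc=com",
                search_filter=flt,
                attributes=["cn", "mail", "telephoneNumber"])
    print(flt, "->", len(conn.entries), "entries")

conn.unbind()
```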
- PowerExchange for MongoDB
- You can use the Schema Editor to change the schema of MongoDB collections. You can also use virtual tables for MongoDB collections that have nested columns.
- For more information, see the Informatica PowerExchange for MongoDB 9.6.1 User Guide for PowerCenter.
- PowerExchange for Netezza
- When you use bulk mode to read data from or write data to Netezza, you can override the table name and schema name in the session properties.
- For more information, see the Informatica PowerExchange for Netezza 9.6.1 User Guide for PowerCenter.
- PowerExchange for Salesforce
- You can configure a session to use the Salesforce Bulk API to read data in bulk from a Salesforce source (see the sketch after this entry).
- You can dissociate a custom child object from a standard parent object.
- For more information, see the Informatica PowerExchange for Salesforce 9.6.1.0.1 User Guide for PowerCenter.
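The Bulk API option is set in the session properties, so the sketch below is only a stand-alone illustration of a bulk extract using the simple_salesforce client. The credentials, object, and field names are placeholders, and the client call shown is an assumption about that library rather than anything PowerCenter does.

```python
from simple_salesforce import Salesforce

# Placeholders only; a real extract would use your org's credentials.
sf = Salesforce(
    username="integration@example.com",
    password="secret",
    security_token="token",
)

# A single Bulk API query job; Salesforce batches the result internally,
# which is what makes the Bulk API suitable for large extracts.
records = sf.bulk.Account.query("SELECT Id, Name, Industry FROM Account")
print(len(records), "accounts extracted")
```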
- PowerExchange for SAP NetWeaver
- When you run a file mode session to read data from SAP through ABAP, you can configure the FileCompressEnable custom property to enable compressed data transfer. Compressing the data increases session performance and reduces the disk space that the staging file requires.
- The Source_For_BCI relational target in the BCI listener mapping that Informatica ships contains a new column called DataSourceName. You can use this field to partition the data that the Source_For_BCI relational target receives from SAP.
- Informatica ships an activation mapping along with the BCI_Mappings.xml file. You can use the activation mapping to activate multiple DataSources in SAP simultaneously.
- When you use numeric delta pointers to extract business content data, you can extract only the changed data instead of transferring the entire data set.
- For more information, see the Informatica PowerExchange for SAP NetWeaver 9.6.1 User Guide for PowerCenter.