PowerExchange Adapters
This section describes new PowerExchange adapter features in version 10.0.
PowerExchange Adapters for Informatica
This section describes new Informatica adapter features in version 10.0.
PowerExchange for DataSift
Effective in version 10.0, you can parameterize the DataSift data object read operation properties.
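As background, parameterized data object operation properties are typically supplied through a Developer tool parameter file when the deployed mapping runs. The following Python sketch writes such a file and starts the mapping with infacmd ms RunMapping. It is a minimal illustration, not a documented procedure: the project, application, mapping, and parameter names and the domain and credential values are hypothetical, and the parameter file structure and command options should be verified against the documentation for your 10.0 installation.

```python
import subprocess
from pathlib import Path

# Minimal sketch of a Developer tool parameter file. All names below are
# hypothetical; verify the element structure against the parameter file
# format documented for Informatica 10.0.
PARAM_FILE = """<?xml version="1.0" encoding="UTF-8"?>
<root xmlns="http://www.informatica.com/Parameterization/1.0">
  <project name="MyProject">
    <mapping name="m_read_datasift">
      <!-- Overrides a parameterized DataSift read operation property -->
      <parameter name="p_page_size">100</parameter>
    </mapping>
  </project>
</root>
"""

def run_with_parameters(param_path: str = "m_read_datasift.xml") -> None:
    """Write the parameter file, then run the deployed mapping with infacmd."""
    Path(param_path).write_text(PARAM_FILE, encoding="utf-8")
    # Domain, service, credential, and application values are placeholders.
    subprocess.run(
        ["infacmd.sh", "ms", "RunMapping",
         "-dn", "MyDomain", "-sn", "MyDataIntegrationService",
         "-un", "admin", "-pd", "password",
         "-a", "MyApplication", "-m", "m_read_datasift",
         "-pf", param_path],
        check=True,
    )

if __name__ == "__main__":
    run_with_parameters()
```

The same mechanism applies to the other adapters in this section that support parameterized read and write data object operation properties.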
For more information, see the Informatica PowerExchange for DataSift 10.0 User Guide.
PowerExchange for Facebook
Effective in version 10.0, you can parameterize the Facebook data object read operation properties.
For more information, see the Informatica PowerExchange for Facebook 10.0 User Guide.
PowerExchange for Greenplum
Effective in version 10.0, you can perform the following tasks with PowerExchange for Greenplum:
- You can configure dynamic partitioning for Greenplum data objects. You can configure the partition information so that the Data Integration Service determines the number of partitions to create at run time.
- You can parameterize Greenplum data object operation properties to override the write data object operation properties at run time.
- You can set the Max_Line_Length property, an integer that specifies the maximum length of a line in the XML transformation data that is passed to gpload, as sketched after this list.
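For context on this property, the following Python sketch writes a minimal gpload control file and shows where a MAX_LINE_LENGTH entry sits in it. This is an assumption-based illustration rather than the control file that the Data Integration Service generates: the connection, file, and table values are placeholders, and the placement and valid range of MAX_LINE_LENGTH should be verified against the gpload documentation for your Greenplum version.

```python
import yaml  # PyYAML

# Minimal sketch of a gpload control file. Connection, file, and table values
# are placeholders. MAX_LINE_LENGTH corresponds to the Max_Line_Length
# property: the maximum length of a line in the XML transformation data that
# is passed to gpload.
control_file = {
    "VERSION": "1.0.0.1",
    "DATABASE": "analytics",
    "USER": "gpadmin",
    "HOST": "mdw.example.com",
    "PORT": 5432,
    "GPLOAD": {
        "INPUT": [
            {"SOURCE": {"FILE": ["/staging/orders_*.dat"]}},
            {"FORMAT": "text"},
            {"DELIMITER": "|"},
            {"MAX_LINE_LENGTH": 1048576},  # bytes; raise this for long lines
        ],
        "OUTPUT": [
            {"TABLE": "public.orders"},
            {"MODE": "insert"},
        ],
    },
}

with open("orders_load.yml", "w") as f:
    yaml.safe_dump(control_file, f, default_flow_style=False, sort_keys=False)
```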
For more information, see the Informatica PowerExchange for Greenplum 10.0 User Guide.
PowerExchange for HBase
Effective in version 10.0, you can parameterize the HBase data object read and write operation properties.
For more information, see the Informatica PowerExchange for HBase 10.0 User Guide.
PowerExchange for HDFS
Effective in version 10.0, you can parameterize the complex file data object read and write operation properties.
For more information, see the Informatica PowerExchange for HDFS 10.0 User Guide.
PowerExchange for LinkedIn
Effective in version 10.0, you can parameterize the LinkedIn data object read operation properties.
For more information, see the Informatica PowerExchange for LinkedIn 10.0 User Guide.
PowerExchange for SAP NetWeaver
Effective in version 10.0, you can perform the following tasks with PowerExchange for SAP NetWeaver:
- You can use the Developer tool to create an SAP Table data object and a data object read operation. You can then add the read operation as a source or lookup in a mapping, and run the mapping to read or look up data from SAP tables.
- When you read data from SAP tables, you can configure key range partitioning. You can also use parameters to change the connection and Table data object read operation properties at run time.
- You can run a profile against SAP Table data objects.
- When you create an SQL Data Service, you can add an SAP Table data object read operation as a virtual table.
- You can read data from the SAP BW system through an open hub destination or InfoSpoke.
- When you read data from the SAP BW system, you can configure dynamic or fixed partitioning. You can also use parameters to change the connection and BW OHS Extract data object read operation properties at run time.
- You can write data to the SAP BW system through a 3.x data source or a 7.x data source.
- When you write data to the SAP BW system, you can configure dynamic partitioning. You can also use parameters to change the connection and BW Load data object write operation properties at run time.
- You can create an SAP connection in the Administrator tool.
- When you use the Developer tool to read data from or write data to SAP BW, you can create an SAP BW Service in the Administrator tool.
For more information, see the Informatica PowerExchange for SAP NetWeaver 10.0 User Guide.
PowerExchange for Teradata Parallel Transporter API
Effective in version 10.0, you can perform the following tasks with PowerExchange for Teradata Parallel Transporter API:
- You can use PowerExchange for Teradata Parallel Transporter API to read large volumes of data from Teradata tables.
- You can use the Update system operator to perform insert, update, upsert, and delete operations against Teradata database tables.
- You can use the Secure Sockets Layer (SSL) protocol to configure a secure connection between the Developer tool and the Teradata database.
- You can configure dynamic partitioning for Teradata Parallel Transporter API data objects. You can configure the partition information so that the Data Integration Service determines the number of partitions to create at run time.
- You can parameterize Teradata data object operation properties to override the read and write data object operation properties at run time.
For more information, see the Informatica PowerExchange for Teradata Parallel Transporter API 10.0 User Guide.
PowerExchange for Twitter
Effective in version 10.0, you can parameterize the read operation properties for Twitter and Twitter Streaming data objects.
For more information, see the Informatica PowerExchange for Twitter 10.0 User Guide.
PowerExchange for Web Content-Kapow Katalyst
Effective in version 10.0, you can parameterize the Web Content-Kapow Katalyst data object read operation properties.
For more information, see the Informatica PowerExchange for Web Content-Kapow Katalyst 10.0 User Guide.