Support Changes
This section describes the support changes in version 10.4.0.
PowerCenter
This section describes PowerCenter support changes in version 10.4.0.
Connectivity
This section describes the connectivity support changes in version 10.4.0.
Licensing for SAP HANA
Effective in version 10.4.0, you need an SAP HANA license to read data from SAP HANA sources and write data to SAP HANA targets.
In the ODBC connection, if the ODBC subtype is set to SAP HANA and the SAP HANA license is not available, sessions fail at run time.
For more information, see the Informatica 10.4.0 PowerCenter Designer Guide.
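On Linux, the ODBC data source that such a connection points to is typically defined in odbc.ini. The following is a minimal sketch, not taken from the guide: the DSN name, driver path, host, and port are placeholders, and the shared library name is the one commonly shipped with the SAP HANA client.

```ini
; Hypothetical odbc.ini entry for an SAP HANA source or target.
; Driver path, host, and port are placeholders for your system.
[HANA_DSN]
Driver=/usr/sap/hdbclient/libodbcHDB.so
ServerNode=hana-host:30015
```

In the PowerCenter ODBC connection that uses this DSN, the ODBC subtype attribute would then be set to SAP HANA so that the SAP HANA license applies.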
PowerExchange Adapters for Informatica
This section describes support changes to Informatica adapters in 10.4.0.
PowerExchange for SAP NetWeaver
Effective in version 10.4.0, PowerExchange for SAP NetWeaver includes the following changes:
- Informatica dropped support for non-Unicode transports.
Previously, Informatica supported non-Unicode transports.
- Informatica ships transports for SAP Unicode versions 5.0 and later in the following folders:
  - Unicode cofiles: <Informatica installer zip file>/saptrans/mySAP/cofiles
  - Unicode data files: <Informatica installer zip file>/saptrans/mySAP/data
Previously, Informatica packaged the transports for SAP Unicode versions 5.0 and later in the following folders:
  - Unicode cofiles: <Informatica installer zip file>/saptrans/mySAP/UC/cofiles
  - Unicode data files: <Informatica installer zip file>/saptrans/mySAP/UC/data
For more information, see the PowerExchange for SAP NetWeaver 10.4.0 Transport Versions Installation Notice.
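In practice, the transport cofiles and data files are copied from the extracted installer into the SAP system's transport directory before they are imported (for example, through STMS). The following is a hedged sketch under assumed paths: the extraction directory, the transport directory, and the transport file names shown are all placeholders, and the mock files are created only so the copy can be demonstrated end to end.

```shell
# Hypothetical paths: adjust EXTRACT_DIR to where the Informatica
# installer zip file was extracted, and TRANS_DIR to the SAP
# transport directory on the target host (usually /usr/sap/trans).
EXTRACT_DIR=/tmp/informatica_104
TRANS_DIR=/tmp/sap_trans

# Simulate the 10.4.0 layout (no UC subfolder) with mock transport
# files so this sketch is self-contained.
mkdir -p "$EXTRACT_DIR/saptrans/mySAP/cofiles" "$EXTRACT_DIR/saptrans/mySAP/data"
mkdir -p "$TRANS_DIR/cofiles" "$TRANS_DIR/data"
touch "$EXTRACT_DIR/saptrans/mySAP/cofiles/K900123.ER6"
touch "$EXTRACT_DIR/saptrans/mySAP/data/R900123.ER6"

# Copy cofiles and data files into the SAP transport directory.
cp "$EXTRACT_DIR/saptrans/mySAP/cofiles/"* "$TRANS_DIR/cofiles/"
cp "$EXTRACT_DIR/saptrans/mySAP/data/"* "$TRANS_DIR/data/"
ls "$TRANS_DIR/cofiles" "$TRANS_DIR/data"
```

Note that in 10.4.0 the source folders no longer contain the UC subdirectory, so any copy scripts written against the earlier layout need the paths updated.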
PowerExchange Adapters for PowerCenter
This section describes support changes to PowerCenter adapters in 10.4.0.
PowerExchange for SAP NetWeaver
Effective in version 10.4.0, PowerExchange for SAP NetWeaver includes the following changes:
- Informatica dropped support for non-Unicode transports.
Previously, Informatica supported non-Unicode transports.
- Informatica ships transports for SAP Unicode versions 5.0 and later in the following folders:
  - Unicode cofiles: <Informatica installer zip file>/saptrans/mySAP/cofiles
  - Unicode data files: <Informatica installer zip file>/saptrans/mySAP/data
Previously, Informatica packaged the transports for SAP Unicode versions 5.0 and later in the following folders:
  - Unicode cofiles: <Informatica installer zip file>/saptrans/mySAP/UC/cofiles
  - Unicode data files: <Informatica installer zip file>/saptrans/mySAP/UC/data
- Informatica dropped support for reading data from SAP tables through HTTP/HTTPS streaming with PowerExchange for SAP NetWeaver. Use PowerExchange for SAP Dynamic ABAP Table Extractor to read data from SAP tables through HTTP/HTTPS streaming.
Previously, Informatica supported reading data from SAP tables through HTTP/HTTPS streaming with PowerExchange for SAP NetWeaver.
For more information, see the Informatica 10.4.0 PowerExchange for SAP NetWeaver User Guide and PowerExchange for SAP NetWeaver 10.4.0 Transport Versions Installation Notice.
Dropped Support
Effective in version 10.4.0, Informatica dropped support for Solaris. If you are using Solaris, Informatica requires you to upgrade to use a supported operating system.
For more information about how to upgrade to a supported operating system, see the Informatica 10.4.0 upgrade guides. For information about supported operating systems, see the Product Availability Matrix on the Informatica Network.
Technical Preview Support
Technical Preview Initiated
Effective in version 10.4.0, Informatica includes the following functionalities for technical preview:
- Connecting to a blockchain network
You can run mappings on the Spark engine that read data from or write to a blockchain network.
- Databricks delta table as streaming mapping target
You can use a Databricks delta table as the target of a streaming mapping for the ingestion of streaming data.
- Python transformation on Databricks
You can include the Python transformation in batch mappings configured to run on the Databricks Spark engine.
- Snowflake as a streaming mapping target
You can configure Snowflake as a target in a streaming mapping to write data to Snowflake.
- Dynamic streaming mapping
You can configure dynamic streaming mappings to change Kafka sources and targets at run time based on the parameters and rules that you define in a Confluent Schema Registry.
Technical preview functionality is supported for evaluation purposes but is unwarranted and is not production-ready. Informatica recommends that you use it in non-production environments only. Informatica intends to include the preview functionality in an upcoming release for production use, but might choose not to in accordance with changing market or technical circumstances. For more information, contact Informatica Global Customer Support.
Technical Preview Lifted
Effective in version 10.4.0, the following functionalities are lifted from technical preview:
- Hierarchical data preview
You can preview hierarchical data within a mapping from the Developer tool for mappings configured to run with Amazon EMR, Cloudera CDH, and Hortonworks HDP. Previewing hierarchical data in mappings configured to run with Azure HDInsight and MapR is still available for technical preview.
- PowerExchange for Amazon S3
You can use intelligent structure models when importing a data object.
- PowerExchange for Microsoft Azure Cosmos DB SQL API
You can develop and run mappings in the Azure Databricks environment.
- PowerExchange for Microsoft Azure SQL Data Warehouse
You can create and run dynamic mappings.
You can use full pushdown optimization when an ODBC connection is used to connect to the Microsoft Azure SQL Data Warehouse database.
- SSL-enabled Kafka connections
You can use SSL-enabled Kafka connections for streaming mappings.
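For context, SSL on a Kafka connection corresponds to the standard Apache Kafka client security properties. The fragment below is a generic sketch of those client properties, not an Informatica-specific configuration; the keystore and truststore paths and passwords are placeholders.

```properties
# Hypothetical Kafka client SSL settings; paths and passwords are
# placeholders. Property names come from the Apache Kafka client
# configuration, not from Informatica.
security.protocol=SSL
ssl.truststore.location=/etc/kafka/ssl/client.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/etc/kafka/ssl/client.keystore.jks
ssl.keystore.password=changeit
```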
Deferment
This section describes the deferment changes in version 10.4.0.
Deferment Lifted
Effective in version 10.4.0, the following functionalities are no longer deferred:
- Data Masking transformation in streaming mappings.
- Kerberos cross realm authentication.
- Monitoring statistics for streaming jobs.