New Features (9.6.1 HotFix 1)
This section describes new features in version 9.6.1 HotFix 1.
Big Data
This section describes new big data features in version 9.6.1 HotFix 1.
Data Warehousing
Big Data Edition has the following new features and enhancements for data warehousing:
- Binary Data Type
Effective in version 9.6.1 HotFix 1, a mapping in the Hive environment can process binary data that passes through the mapping ports. However, the mapping cannot process expression functions that use binary data.
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition User Guide.
- Truncate Partitions in a Hive Target
Effective in version 9.6.1 HotFix 1, the Data Integration Service can truncate the partition in the Hive target. You must choose to both truncate the partition in the Hive target and truncate the target table.
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition User Guide.
Hadoop Distributions
Effective in version 9.6.1 HotFix 1, Big Data Edition added support for the following Hadoop distributions:
- Cloudera CDH 5.1
- Hortonworks HDP 2.1
Big Data Edition dropped support for Hortonworks HDP 2.0.
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition Installation and Configuration Guide.
Hadoop Ecosystem
Big Data Edition has the following new features and enhancements for the Hadoop ecosystem:
- Cloudera Manager
Effective in version 9.6.1 HotFix 1, you can use Cloudera Manager to distribute the Big Data Edition installation as parcels across the Hadoop cluster nodes for Cloudera CDH 5.1.
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition Installation and Configuration Guide.
- High Availability
Effective in version 9.6.1 HotFix 1, you can enable the Data Integration Service and the Developer tool to read from and write to a highly available Hadoop cluster. A highly available Hadoop cluster can provide uninterrupted access to the JobTracker, NameNode, and ResourceManager in the cluster. You must configure the Developer tool to communicate with a highly available Hadoop cluster on a Hadoop distribution.
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition Installation and Configuration Guide.
- Kerberos Authentication
Effective in version 9.6.1 HotFix 1, you can configure the Informatica domain that uses Kerberos authentication to run mappings in a Hadoop cluster that also uses Kerberos authentication. You must configure a one-way cross-realm trust to enable the Hadoop cluster to communicate with the Informatica domain.
Previously, you could run mappings in a Hadoop cluster that used Kerberos authentication if the Informatica domain did not use Kerberos authentication.
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition User Guide.
- Schedulers
Effective in version 9.6.1 HotFix 1, the following schedulers are valid for Hadoop distributions:
- Capacity scheduler
- Fair scheduler
For more information, see the Informatica 9.6.1 HotFix 1 Big Data Edition Installation and Configuration Guide.
Business Glossary
This section describes new Business Glossary features in version 9.6.1 HotFix 1.
- Export Relationship View Diagram
Effective in version 9.6.1 HotFix 1, you can export the relationship view diagram after you open it. Export the relationship view diagram to access the diagram when you are not logged in to the Analyst tool or to share the diagram with users who cannot access Business Glossary.
For more information, see the Informatica 9.6.1 HotFix 1 Business Glossary Guide.
- Multi-valued Attributes in Business Glossary Desktop
Effective in version 9.6.1 HotFix 1, you can view multi-valued attributes in Business Glossary Desktop. Previously, you could only view single-valued attributes. Properties such as Contains and See Also are examples of multi-valued attributes.
Command Line Programs
This section describes new and changed commands and options for the Informatica command line programs in version 9.6.1 HotFix 1.
pmrep Command
Effective in version 9.6.1 HotFix 1, the following table describes an updated pmrep command:
| Command | Description |
| --- | --- |
| PurgeVersion | Contains the following new option: -k (log objects not purged). Optional. Lists all the object names and versions that do not purge although they match the purge criteria. The -k option also lists the reason that the object versions did not purge. For example, an object version does not purge if you do not have sufficient privileges to purge the object. |
isp Commands
Effective in version 9.6.1 HotFix 1, the following table describes new isp commands:
| Command | Description |
| --- | --- |
| convertUserActivityLog | Converts binary user activity logs to text or XML format. |
| getUserActivityLog | Retrieves user activity logs in binary, text, or XML format. |
| migrateUsers | Migrates the groups, roles, privileges, and permissions of users in a native security domain to users in one or more LDAP security domains. Requires a user migration file. |
Connectivity
This section describes new connectivity features in version 9.6.1 HotFix 1.
Netezza Connectivity
Effective in version 9.6.1 HotFix 1, you can use ODBC to read data from and write data to a Netezza database.
For more information, see the Informatica 9.6.1 HotFix 1 Developer Tool Guide.
Data Quality Accelerators
This section describes new Data Quality accelerator features in version 9.6.1 HotFix 1.
Data Cleansing Rules
Effective in version 9.6.1 HotFix 1, you can select the following rule when you add the Core accelerator to a Model repository project:
- rule_GTIN_Validation
Validates a Global Trade Item Number (GTIN). The rule validates eight-digit, twelve-digit, thirteen-digit, and fourteen-digit numbers. The rule returns "Valid" if the check digit is correct for the number and "Invalid" if the check digit is incorrect.
Find the rule in the General_Data_Cleansing folder of the accelerator project in the Model repository.
For more information, see the Informatica 9.6.1 HotFix 1 Accelerator Guide.
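The check-digit test that rule_GTIN_Validation performs follows the standard GS1 modulo-10 calculation, which applies alternating weights of 3 and 1 starting from the digit to the left of the check digit. The following Python sketch illustrates that calculation; the function name and structure are illustrative only and are not part of the accelerator.

```python
def validate_gtin(gtin: str) -> str:
    """Illustrative GS1 modulo-10 check for GTIN-8/12/13/14 numbers.

    Returns "Valid" if the check digit is correct for the number
    and "Invalid" if it is incorrect, mirroring the rule's outputs.
    """
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return "Invalid"
    digits = [int(d) for d in gtin]
    check_digit = digits[-1]
    # Weights alternate 3, 1, 3, 1, ... moving left from the digit
    # immediately before the check digit.
    total = sum(d * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(digits[:-1])))
    expected = (10 - total % 10) % 10
    return "Valid" if expected == check_digit else "Invalid"
```

For example, the thirteen-digit number 4006381333931 has a correct check digit and validates as "Valid", while changing the final digit makes it "Invalid".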
Matching Rules
Effective in version 9.6.1 HotFix 1, all Data Quality accelerator rules that perform match analysis contain a pass-through input port and a pass-through output port. Use the ports to pass unique identifiers through a rule.
Find the rules in the Matching_Deduplication folder of the accelerator project in the Model repository.
For more information, see the Informatica 9.6.1 HotFix 1 Accelerator Guide.
Documentation
This section describes new or updated guides included with the Informatica documentation in version 9.6.1 HotFix 1.
The Informatica documentation contains the following changed guide:
- Informatica Business Glossary Version 2.0 API Reference Guide
Effective in version 9.6.1 HotFix 1, a new version of the guide contains URLs and parameters of the Business Glossary REST APIs used to develop a client application.
Informatica Developer
This section describes new Informatica Developer features in version 9.6.1 HotFix 1.
Customized Data Object Write Properties
Effective in version 9.6.1 HotFix 1, the Truncate Hive Target Partition property is added to the customized data object write properties. This property overwrites the partition in the Hive target in which the data is being inserted. To enable this option, you must also select the option to truncate target tables.
For more information, see the Informatica 9.6.1 HotFix 1 Developer Tool Guide.
Netezza Pushdown Optimization
Effective in version 9.6.1 HotFix 1, the Data Integration Service can push transformation logic to Netezza sources that use native drivers.
For more information, see the Informatica 9.6.1 HotFix 1 Mapping Guide.
Secure Communication for SAP HANA
Effective in version 9.6.1 HotFix 1, you can configure secure communication to an SAP HANA database with the SSL protocol.
Informatica Domain
This section describes new Informatica domain features in version 9.6.1 HotFix 1.
Effective in version 9.6.1 HotFix 1, you can install Informatica services on a Windows or Linux operating system running on an Amazon EC2 instance.
Informatica Transformations
This section describes new Informatica transformation features in version 9.6.1 HotFix 1.
Address Validator Transformation
Effective in version 9.6.1 HotFix 1, you can select the following ports on the Address Validator transformation:
- Input Data
Output port that contains the data elements in an input address record in a structured XML format.
- Result
Output port that contains data elements that represent the data in an output address in a structured XML format.
Find the Input Data port and the Result port in the XML port group on the transformation.
For more information, see the Informatica 9.6.1 HotFix 1 Address Validator Port Reference.
Mappings
This section describes new mapping features in version 9.6.1 HotFix 1.
Informatica Mappings
Branch Pruning Optimization Method
Effective in version 9.6.1 HotFix 1, the Data Integration Service can apply the branch pruning optimization method. When the Data Integration Service applies the branch pruning method, it removes transformations that do not contribute any rows to the target in a mapping.
The Developer tool enables the branch pruning optimization method by default when you choose the normal or full optimizer level. You can disable branch pruning if the optimization does not increase performance by setting the optimizer level to minimal or none.
For more information, see the Informatica Data Services 9.6.1 HotFix 1 Performance Tuning Guide.
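Conceptually, branch pruning is a reachability problem on the mapping graph: any transformation from which no target can be reached contributes no rows and can be removed. The following Python sketch illustrates the idea under that assumption; the graph representation and names are illustrative and are not Informatica APIs.

```python
def prune_branches(edges, targets):
    """Keep only mapping nodes from which a target is reachable.

    edges: dict mapping each node to its list of downstream nodes.
    targets: set of target nodes in the mapping.
    Returns the set of nodes to keep; all other nodes lie on
    dead branches and can be pruned.
    """
    # Build the reverse graph, then walk upstream from the targets.
    reverse = {}
    for node, downstream in edges.items():
        for d in downstream:
            reverse.setdefault(d, []).append(node)
    keep, stack = set(targets), list(targets)
    while stack:
        node = stack.pop()
        for upstream in reverse.get(node, []):
            if upstream not in keep:
                keep.add(upstream)
                stack.append(upstream)
    return keep
```

For example, in a mapping where a source feeds both a filter that reaches the target and a sorter on a dead branch, pruning keeps the source, the filter, and the target, and removes the sorter.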
Constraints
Effective in version 9.6.1 HotFix 1, the Data Integration Service can read constraints from relational sources, logical data objects, physical data objects, or virtual tables. A constraint is a conditional expression that the values on a data row must satisfy. When the Data Integration Service reads constraints, it might drop data rows for which the constraint does not evaluate to TRUE, depending on the optimization method applied.
For more information, see the Informatica 9.6.1 HotFix 1 Mapping Guide.
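In other words, a constraint acts as a row-level predicate. The following Python snippet is a conceptual stand-in only (not product code): it shows a constraint expression applied to each data row, with rows that do not evaluate to TRUE being dropped.

```python
# Conceptual illustration: a constraint is a predicate that each
# data row must satisfy; rows that do not evaluate to TRUE may be
# dropped during optimization.
rows = [
    {"order_id": 1, "qty": 5},
    {"order_id": 2, "qty": 0},
    {"order_id": 3, "qty": -2},
]

# Example constraint: qty must be positive.
constraint = lambda row: row["qty"] > 0

satisfied = [row for row in rows if constraint(row)]
# Only the row with order_id 1 satisfies the constraint.
```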
Metadata Manager
This section describes new Metadata Manager features in version 9.6.1 HotFix 1.
Browser Support
Effective in version 9.6.1 HotFix 1, the Metadata Manager application can run in the following web browsers:
- Internet Explorer 11.0
- Google Chrome 35
For more information about product requirements and supported platforms, see the Product Availability Matrix on the Informatica My Support Portal:
https://mysupport.informatica.com/community/my-support/product-availability-matrices
Microsoft SQL Server and Oracle Exadata Versions
Effective in version 9.6.1 HotFix 1, Metadata Manager supports the following database versions:
- Microsoft SQL Server 2014
- Oracle Exadata 11g
Therefore, you can perform the following actions:
- Create Microsoft SQL Server or Oracle resources that extract metadata from these database versions.
- Create Business Glossary, Informatica Platform, or PowerCenter resources when the Model repository or PowerCenter repository is in either of these database versions.
- Create the Metadata Manager repository in either of these database versions.
For more information about creating resources, see the Informatica 9.6.1 HotFix 1 Metadata Manager Administrator Guide. For more information about creating the Metadata Manager repository, see the Informatica 9.6.1 HotFix 1 Installation and Configuration Guide.
Security Enhancements
Effective in version 9.6.1 HotFix 1, when you create or edit a PowerCenter resource, you can prevent Metadata Manager from displaying secure JDBC parameters that are part of the JDBC URL for the PowerCenter repository database.
For more information, see the Informatica 9.6.1 HotFix 1 Metadata Manager Administrator Guide.
PowerCenter
This section describes new PowerCenter features in version 9.6.1 HotFix 1.
Secure Communication for SAP HANA
Effective in version 9.6.1 HotFix 1, you can configure secure communication to an SAP HANA database with the SSL protocol.
PowerExchange Adapters
This section describes new PowerExchange adapter features in version 9.6.1 HotFix 1.
PowerExchange Adapters for Informatica
This section describes new Informatica adapter features in version 9.6.1 HotFix 1.
PowerExchange for Cassandra
Effective in version 9.6.1 HotFix 1, you can use PowerExchange for Cassandra to read data from or write data to a Cassandra database. You can add a Cassandra data object as a source or a target in a mapping and run the mapping to read or write data. You can create virtual tables to use Cassandra collections in a mapping.
For more information, see the Informatica PowerExchange for Cassandra 9.6.1 HotFix 1 User Guide.
PowerExchange for Greenplum
Effective in version 9.6.1 HotFix 1, you can configure secure communication to a Greenplum database with the SSL protocol.
For more information, see the Informatica PowerExchange for Greenplum 9.6.1 HotFix 1 User Guide.
PowerExchange for HBase
Effective in version 9.6.1 HotFix 1, you can use PowerExchange for HBase to connect to an HBase data store that uses Kerberos authentication. You must enable Kerberos authentication and configure HBase connection properties to access an HBase data store that uses Kerberos authentication.
For more information, see the Informatica PowerExchange for HBase 9.6.1 HotFix 1 User Guide.
PowerExchange for HDFS
Effective in version 9.6.1 HotFix 1, when you read complex files, you can use the com.informatica.adapter.hdfs.hadoop.io.InfaBatchTextInputFormat input format to read text files in batches and increase performance.
For more information, see the Informatica PowerExchange for HDFS 9.6.1 HotFix 1 User Guide.
PowerExchange for Hive
Effective in version 9.6.1 HotFix 1, PowerExchange for Hive supports the Binary data type in a Hive environment. The Binary data type has a range of 1 to 104,857,600 bytes.
For more information, see the Informatica PowerExchange for Hive 9.6.1 HotFix 1 User Guide.
PowerExchange for Salesforce
Effective in version 9.6.1 HotFix 1, you can use the PowerExchange for Salesforce connection listed under the Cloud connection category to read data from and write data to Salesforce. You can add a Salesforce data object operation as a source or a target in a mapping and run the mapping to read or write data.
For more information, see the Informatica PowerExchange for Salesforce 9.6.1 HotFix 1 User Guide.
PowerExchange for SAS
Effective in version 9.6.1 HotFix 1, you can use PowerExchange for SAS to read data from SAS and write data to SAS.
For more information, see the Informatica PowerExchange for SAS 9.6.1 HotFix 1 User Guide.
PowerExchange for Tableau
Effective in version 9.6.1 HotFix 1, you can use PowerExchange for Tableau to generate the Tableau data extract file by reading data from multiple sources, such as flat files and SAP applications. Business users can open the extract file in Tableau Desktop to visualize the data and identify patterns and trends.
For more information, see the Informatica PowerExchange for Tableau 9.6.1 HotFix 1 User Guide.
PowerExchange Adapters for PowerCenter
This section describes new PowerCenter adapter features in version 9.6.1 HotFix 1.
PowerExchange for Cassandra
Effective in version 9.6.1 HotFix 1, you can use PowerExchange for Cassandra to extract data from and load data to a Cassandra database. You can create virtual tables to use Cassandra collections in a mapping.
For more information, see the Informatica PowerExchange for Cassandra 9.6.1 HotFix 1 User Guide for PowerCenter.
PowerExchange for Greenplum
Effective in version 9.6.1 HotFix 1, you can configure secure communication to a Greenplum database with the SSL protocol.
For more information, see the Informatica PowerExchange for Greenplum 9.6.1 HotFix 1 User Guide for PowerCenter.
PowerExchange for Vertica
Effective in version 9.6.1 HotFix 1, you can use PowerExchange for Vertica to write large volumes of data to a Vertica database.
For more information, see the Informatica PowerExchange for Vertica 9.6.1 HotFix 1 User Guide for PowerCenter.
Reference Data
This section describes new reference data features in version 9.6.1 HotFix 1.
Probabilistic Models
Effective in version 9.6.1 HotFix 1, you can view the total number of reference data values that you assigned to a label in a probabilistic model.
You can use wildcard characters to search for data values in a probabilistic model.
For more information, see the Informatica 9.6.1 HotFix 1 Reference Data Guide.
Rule Specifications
This section describes new rule specification features in version 9.6.1 HotFix 1.
Date and Time Operations
Effective in version 9.6.1 HotFix 1, you can configure a rule statement to perform the following operations on date and time data:
- Return the date and time at which the Data Integration Service runs the mapping that contains the rule statement.
- Determine if a time stamp references a point in time before or after the Data Integration Service runs the mapping that contains the rule statement.
- Convert a string of date and time data to a date/time data type.
For more information, see the Informatica 9.6.1 HotFix 1 Rule Specification Guide.
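The three operations above can be pictured with stdlib Python stand-ins. This is a conceptual illustration only; rule statements are configured in the Analyst tool, not written as code, and the format string below is an assumed example.

```python
from datetime import datetime

# 1. Return the date and time at which the mapping runs.
run_time = datetime.now()

# 2. Determine whether a time stamp references a point in time
#    before or after the run time.
stamp = datetime(2014, 1, 15, 9, 30)
is_before_run = stamp < run_time

# 3. Convert a string of date and time data to a date/time type.
converted = datetime.strptime("2014-08-12 14:05:00",
                              "%Y-%m-%d %H:%M:%S")
```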
Reference Table Operations
Effective in version 9.6.1 HotFix 1, you can configure a rule statement to return a value that you specify when an input value matches a reference table value.
For more information, see the Informatica 9.6.1 HotFix 1 Rule Specification Guide.