Version 9.6.0
This section describes new features and enhancements in version 9.6.0.
Informatica Analyst
This section describes new features and enhancements to Informatica Analyst.
Informatica Analyst Interface
The Analyst tool interface has new headers and workspaces. A workspace is a web page where you perform tasks based on licensed functionality that you access through tabs in the Analyst tool.
The Analyst tool has the following workspaces:
- •Start. Use the access panels on this workspace to open the other workspaces that you are licensed to use. If you have the license to perform exception management, your tasks appear in this workspace.
- •Glossary. Define and describe business concepts that are important to your organization.
- •Discovery. Analyze the quality of data and metadata in source systems.
- •Design. Design business logic that helps analysts and developers collaborate.
- •Scorecards. Open, edit, and run scorecards that you created from profile results.
- •Library. Search for assets in the Model repository. You can also view metadata in the Library workspace.
- •Exceptions. View and manage exception record data for a task. View duplicate record clusters or exception records based on the type of task you are working on. View an audit trail of the changes you make to records in a task.
- •Connections. Create and manage connections to import relational data objects, preview data, run a profile, and run mapping specifications.
- •Data Domains. Create, manage, and remove data domains and data domain groups.
- •Job Status. Monitor the status of Analyst tool jobs such as data preview for all objects and drilldown operations on profiles.
- •Projects. Create and manage folders and projects and assign permissions on projects.
- •Glossary Security. Manage permissions, privileges, and roles for business glossary users.
Informatica Analyst Tasks
The Analyst tool is available to multiple Informatica products and is used by business users to collaborate on projects within an organization.
The tasks that you can perform in the Analyst tool depend on the license for Informatica products and the privileges to perform tasks. Based on the license that your organization has, you can use the Analyst tool to perform the following tasks:
- •Define business glossaries, terms, and policies to maintain standardized definitions of data assets in the organization.
- •Perform data discovery to find the content, quality, and structure of data sources, and monitor data quality trends.
- •Define data integration logic and collaborate on projects to accelerate project delivery.
- •Define and manage rules to verify data conformance to business policies.
- •Review and resolve data quality issues to find and fix data quality issues in the organization.
Flat File Delimiters
When you import a delimited flat file, you can enter non-printing multibyte characters, such as /01 and /001, as delimiters.
For more information, see the Informatica 9.6.0 Analyst Tool Guide.
Informatica Installer
This section describes new features and enhancements to the Informatica platform installer.
Accessibility and Section 508 Compliance
The Informatica platform installer conforms to Section 508 of the Rehabilitation Act and is accessible to people with disabilities.
Authentication
You can configure the Informatica domain to use Kerberos authentication. When you install the Informatica services, you can enable Kerberos authentication for the domain. A page titled Domain - Network Authentication Protocol appears in the Informatica services installer. To install the domain with Kerberos authentication, select the option to enable Kerberos authentication and enter the required parameters.
Encryption Key
Informatica encrypts sensitive data such as passwords when it stores data in the domain. Informatica uses a keyword to generate a unique encryption key with which to encrypt sensitive data stored in the domain.
A page titled Domain - Encryption Key appears in the Informatica services installer. If you create a node and a domain during installation, you must specify a keyword for Informatica to use to generate a unique encryption key for the node and domain. If you create a node and join a domain, Informatica uses the same encryption key for the new node.
Secure Communication
You can provide an SSL certificate or use the default Informatica SSL certificate to secure communication between services in the domain. To use your SSL certificate, specify a keystore and truststore file and password during installation.
For more information, see the Informatica 9.6.0 installation and upgrade guides.
Informatica Data Explorer
This section describes new features and enhancements to Informatica Data Explorer.
Column Profile Results
The column profile results include the sum of all values in columns with a numeric datatype.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the TOTAL_SUM column in the following relational database views to access the profiling warehouse for information about the sum of values in numeric columns:
- •IDPV_COL_PROFILE_RESULTS
- •IDPV_PROFILE_RESULTS_TRENDING
For more information, see the Informatica 9.6.0 Database View Reference.
Curation
You can curate inferred profile results in both the Analyst tool and the Developer tool. Curation is the process of validating and managing the discovered metadata of a data source so that the metadata is fit for use and reporting. You can approve, reject, and restore datatypes, data domains, primary keys, and foreign keys. You can hide or show rows that contain rejected datatypes or data domains. You can also exclude approved datatypes, data domains, and primary keys from column profile inference and data domain discovery inference when you run the profile again.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the following relational database views to access the profiling warehouse for information about curated profile results:
- •IDPV_CURATED_DATATYPES
- •IDPV_CURATED_DATADOMAINS
- •IDPV_CURATED_PRIMARYKEYS
- •IDPV_CURATED_FOREIGNKEYS
For more information, see the Informatica 9.6.0 Database View Reference.
Data Domain Discovery
You can run data domain discovery on all rows of the source data to verify the inference results for multiple columns at the same time.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Datatype Inference
You can infer multiple datatypes that match the inference criteria when you run a column profile. You can drill down based on a column datatype in column profile results.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the following relational database views to access the profiling warehouse for information about inferred datatypes:
- •IDPV_DATATYPES_INF_RESULTS
- •IDPV_DATATYPE_FREQ_TRENDING
For more information, see the Informatica 9.6.0 Database View Reference.
Discovery Search
Discovery search finds assets and identifies relationships to other assets in the databases and schemas of the enterprise. You can use discovery search to find where the data and metadata exists in the enterprise. You can find physical data sources and data object relationships or you can identify the lack of documented data object relationships. You can view the direct matches, indirect matches, and related assets from the discovery search results.
If you perform a global search, the Analyst tool performs a text-based search for data objects, datatypes, and folders. If you perform discovery search, in addition to the text matches, search results include objects with relationships to the objects that match the search criteria.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Enterprise Discovery
You can perform enterprise discovery in Informatica Analyst. Enterprise discovery includes column profiling and data domain discovery.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Profile Results Verification
You can verify multiple inferred primary key and functional dependency results for a single data object in the Developer tool. When you verify the profile results, the Developer tool runs the profile on all rows of the source data. You can also verify multiple data object relationships and data domains in the enterprise discovery results.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Scorecards
You can export scorecard results to a Microsoft Excel file. The exported file contains scorecard summary, trend charts, rows that are not valid, and scorecard properties.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Support for bigint Datatype
You can run a profile on a data source with a large number of rows, such as many billions of rows. The profiling warehouse uses bigint columns to handle large volumes of source data.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Informatica Data Quality
This section describes new features and enhancements to Informatica Data Quality.
Accelerators
The set of Informatica accelerators has the following additions:
- •Informatica Data Quality Accelerator for Spain. Contains rules, reference tables, demonstration mappings, and demonstration data objects that solve common data quality issues in Spanish data.
- •Informatica Data Quality Accelerator for Data Discovery. Contains rules, reference tables, demonstration mappings, and demonstration data objects that you can use to perform data discovery operations.
For more information, see the Informatica Data Quality 9.6.0 Accelerator Guide.
Address Validation
You can configure the following advanced properties on the Address Validator transformation:
- Dual Address Priority
- Determines the type of address to validate. Set the property when input address records contain more than one type of valid address data.
- Flexible Range Expansion
- Imposes a practical limit on the number of suggested addresses that the transformation returns when there are multiple valid addresses on a street. Set the property when you set the Ranges to Expand property.
- Geocode Data Type
- Determines how the transformation calculates geocode data for an address. Geocodes are latitude and longitude coordinates. Set the property to return the latitude and longitude coordinates of the entrance to a building or a plot of land, or the coordinates of the geographic center of a plot of land. The transformation can also estimate the latitude and longitude coordinates for an address. Estimated geocodes are called interpolated geocodes.
- Global Max Field Length
- Determines the maximum number of characters on any line in the address. Set the property to verify that the line length in an address does not exceed the requirements of the local mail carrier.
- Ranges To Expand
- Determines how the transformation returns suggested addresses for a street address that does not specify a house number. Set the property to increase or decrease the range of suggested addresses for the street.
- Standardize Invalid Addresses
- Determines if the transformation standardizes data values in an undeliverable address. Set the property to simplify the terminology in the address record so that downstream data processes can run more efficiently.
You can configure the following address validation process property in the Administrator tool:
- SendRight Report Location
- The location to which address validation writes a SendRight report and any log file that relates to the creation of the report. Generate a SendRight report to verify that a set of New Zealand address records meets the certification standards of New Zealand Post.
Note: You configure the Address Validator transformation to create a SendRight report file.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Automatic Workflow Recovery
You can configure automatic recovery of workflow instances that are aborted because of an unexpected shutdown of the Data Integration Service process. When you configure automatic recovery, the Data Integration Service process recovers the aborted workflow instances when the service process restarts.
For more information, see the Informatica 9.6.0 Developer Workflow Guide.
Business Glossary
Business Glossary comprises online glossaries of business terms and policies that define important concepts within an organization. Data stewards create and publish terms that include information such as descriptions, relationships to other terms, and associated categories. Glossaries are stored in a central location for easy lookup by end-users.
Business Glossary is made up of glossaries, business terms, policies, and categories. A glossary is the high-level container that stores other glossary content. A business term defines relevant concepts within the organization, and a policy defines the business purpose that governs practices related to the term. Business terms and policies can be associated with categories, which are descriptive classifications. You can access Business Glossary through Informatica Analyst (the Analyst tool).
For more information, see the Informatica 9.6.0 Business Glossary Guide.
Column Profile Results
The column profile results include the sum of all values in columns with a numeric datatype.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the TOTAL_SUM column in the following relational database views to access the profiling warehouse for information about the sum of values in numeric columns:
- •IDPV_COL_PROFILE_RESULTS
- •IDPV_PROFILE_RESULTS_TRENDING
For more information, see the Informatica 9.6.0 Database View Reference.
Curation
You can curate inferred profile results in both the Analyst tool and the Developer tool. Curation is the process of validating and managing the discovered metadata of a data source so that the metadata is fit for use and reporting. You can approve, reject, and restore datatypes, data domains, primary keys, and foreign keys. You can hide or show rows that contain rejected datatypes or data domains. You can also exclude approved datatypes, data domains, and primary keys from column profile inference and data domain discovery inference when you run the profile again.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the following relational database views to access the profiling warehouse for information about curated profile results:
- •IDPV_CURATED_DATATYPES
- •IDPV_CURATED_DATADOMAINS
- •IDPV_CURATED_PRIMARYKEYS
- •IDPV_CURATED_FOREIGNKEYS
For more information, see the Informatica 9.6.0 Database View Reference.
Datatype Inference
You can infer multiple datatypes that match the inference criteria when you run a column profile. You can drill down based on a column datatype in column profile results.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the following relational database views to access the profiling warehouse for information about inferred datatypes:
- •IDPV_DATATYPES_INF_RESULTS
- •IDPV_DATATYPE_FREQ_TRENDING
For more information, see the Informatica 9.6.0 Database View Reference.
Identity Index Data Persistence
You can configure a Match transformation to write the identity index data for a data source to database tables. You can also configure a Match transformation to compare a data source to the identity index data in the database tables. Because the index data for one of the two data sources is already stored, identity match mappings take less time to run.
When you configure a Match transformation to read index tables, you control the types of record that the transformation analyzes and the types of output that the transformation generates. You can configure the transformation to analyze all the records in the data sources or a subset of the records. You can configure the transformation to write all records as output or a subset of the records.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Java Transformation
In a Java transformation, you can configure an input port as a partition key and a sort key, and you can assign a sort direction. The partition key and sort key are valid when you process the transformation in a mapping that runs in a Hive environment.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Lookup Transformation
If you cache the lookup source for a Lookup transformation, you can use a dynamic cache to update the lookup cache based on changes to the target. The Data Integration Service updates the cache before it passes each row to the target.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Normalizer Transformation
The Normalizer transformation is an active transformation that transforms one source row into multiple output rows. When a Normalizer transformation receives a row that contains repeated fields, it generates an output row for each instance of the repeated data.
Use the Normalizer transformation when you want to organize repeated data from a relational or flat file source before you load the data to a target.
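For example, assume that a source row contains a customer ID and four repeated quarterly sales fields. In this hypothetical illustration, the Normalizer transformation returns four output rows for the customer, and each output row contains the customer ID and a single quarterly sales value.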
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Performance
In the Developer tool you can enable a mapping to perform the following optimizations:
- •Push a Union transformation to a relational data object.
- •Push Filter, Expression, Union, Sorter, and Aggregator transformations to a Hive relational object.
For more information, see the Informatica 9.6.0 Mapping Guide.
Profile Results Verification
You can verify multiple inferred primary key and functional dependency results for a single data object in the Developer tool. When you verify the profile results, the Developer tool runs the profile on all rows of the source data. You can also verify multiple data object relationships and data domains in the enterprise discovery results.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Pushdown Optimization
The Data Integration Service can push expression, aggregator, operator, union, sorter, and filter functions to Greenplum sources when the connection type is ODBC.
For more information, see the Informatica 9.6.0 Mapping Guide.
Rule Builder
Rule Builder is an Informatica Analyst feature that converts business rule requirements to transformation logic. You save the business rule requirements in a rule specification. When you compile the rule specification, the Analyst tool creates transformations that can analyze the business data according to the requirements that you defined. The Analyst tool saves the transformations to one or more mapplets in the Model repository.
A rule specification contains one or more IF-THEN statements. The IF-THEN statements use logical operators to determine if the input data satisfies the conditions that you specify. You can use AND operators to link IF statements and verify that a data value satisfies multiple conditions concurrently. You can define statements that compare data from different inputs and test the inputs under different mathematical conditions. You can also link statements so that the output from one statement becomes the input to another.
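For example, a rule specification might capture logic of the following form, where the input and output names are hypothetical and the syntax is a conceptual sketch rather than the exact Analyst tool representation:
IF annual_income > 50000 AND state = 'CA' THEN risk_band = 'LOW'
When you compile the rule specification, the generated transformation logic applies this test to the business data.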
Rule Builder represents a link between business users and the Informatica development environment. Business users can log in to the Analyst tool to create mapplets. Developer tool users add the mapplets to mappings and verify that the business data conforms to the business rules.
For more information, see the Informatica 9.6.0 Rule Builder Guide.
Scorecards
You can export scorecard results to a Microsoft Excel file. The exported file contains scorecard summary, trend charts, rows that are not valid, and scorecard properties.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Sequence Generator Transformation
Effective in 9.6.0, you can use the Sequence Generator transformation to add a sequence of values to your mappings.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Informatica Data Services
This section describes new features and enhancements to Informatica Data Services.
Column Profile Results
The column profile results include the sum of all values in columns with a numeric datatype.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the TOTAL_SUM column in the following relational database views to access the profiling warehouse for information about the sum of values in numeric columns:
- •IDPV_COL_PROFILE_RESULTS
- •IDPV_PROFILE_RESULTS_TRENDING
For more information, see the Informatica 9.6.0 Database View Reference.
Curation
You can curate inferred profile results in both the Analyst tool and the Developer tool. Curation is the process of validating and managing the discovered metadata of a data source so that the metadata is fit for use and reporting. You can approve, reject, and restore datatypes, data domains, primary keys, and foreign keys. You can hide or show rows that contain rejected datatypes or data domains. You can also exclude approved datatypes, data domains, and primary keys from column profile inference and data domain discovery inference when you run the profile again.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the following relational database views to access the profiling warehouse for information about curated profile results:
- •IDPV_CURATED_DATATYPES
- •IDPV_CURATED_DATADOMAINS
- •IDPV_CURATED_PRIMARYKEYS
- •IDPV_CURATED_FOREIGNKEYS
For more information, see the Informatica 9.6.0 Database View Reference.
Datatype Inference
You can infer multiple datatypes that match the inference criteria when you run a column profile. You can drill down based on a column datatype in column profile results.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Use the following relational database views to access the profiling warehouse for information about inferred datatypes:
- •IDPV_DATATYPES_INF_RESULTS
- •IDPV_DATATYPE_FREQ_TRENDING
For more information, see the Informatica 9.6.0 Database View Reference.
Data Masking Transformation
The Data Masking transformation has the following new features in this release:
- •The Data Masking transformation is supported on Hadoop clusters. You can run the transformation in a Hive environment.
- •Tokenization is a masking technique in which you can provide JAR files with your own algorithm or logic to mask string data.
- •You can use the Phone masking technique to mask fields with numeric integer and numeric bigint datatypes.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Java Transformation
In a Java transformation, you can configure an input port as a partition key and a sort key, and you can assign a sort direction. The partition key and sort key are valid when you process the transformation in a mapping that runs in a Hive environment.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Normalizer Transformation
The Normalizer transformation is an active transformation that transforms one source row into multiple output rows. When a Normalizer transformation receives a row that contains repeated fields, it generates an output row for each instance of the repeated data.
Use the Normalizer transformation when you want to organize repeated data from a relational or flat file source before you load the data to a target.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Performance
In the Developer tool you can enable a mapping to perform the following optimizations:
- •Push a custom SQL query to a relational data object.
- •Push operations such as Union, Union All, Intersect, Intersect All, Minus, Minus All, and Distinct to a relational data object.
- •Perform early selection and push queries that contain the SQL keyword LIMIT to a relational data object.
- •Push a Union transformation to a relational data object.
- •Push Filter, Expression, Union, Sorter, and Aggregator transformations to a Hive relational object.
For more information, see the Informatica 9.6.0 Developer User Guide, Informatica 9.6.0 SQL Data Service Guide, and Informatica 9.6.0 Mapping Guide.
Profile Results Verification
You can verify multiple inferred primary key and functional dependency results for a single data object in the Developer tool. When you verify the profile results, the Developer tool runs the profile on all rows of the source data. You can also verify multiple data object relationships and data domains in the enterprise discovery results.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Pushdown Optimization for Greenplum
The Data Integration Service can push expression, aggregator, operator, union, sorter, and filter functions to Greenplum sources when the connection type is ODBC.
For more information, see the Informatica 9.6.0 Mapping Guide.
Pushdown Optimization for SAP HANA
The Data Integration Service can push transformation logic to SAP HANA sources when the connection type is ODBC.
For more information, see the Informatica 9.6.0 Mapping Guide.
Pushdown Optimization for Teradata
The Data Integration Service can push transformation logic to Teradata sources when the connection type is ODBC.
For more information, see the Informatica 9.6.0 Mapping Guide.
REST Web Service Consumer Transformation
The REST Web Service Consumer transformation consumes REST web services in a mapping. The transformation can use GET, PUT, POST, and DELETE HTTP operations.
You can create a REST Web Service Consumer transformation from a Schema object or add elements to an empty transformation.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Scorecards
You can export scorecard results to a Microsoft Excel file. The exported file contains scorecard summary, trend charts, rows that are not valid, and scorecard properties.
For more information, see the Informatica Data Explorer 9.6.0 Data Discovery Guide.
Sequence Generator Transformation
You can now use the Sequence Generator transformation to add a sequence of values to your mappings.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Stored Procedures
You can use the SQL transformation to invoke stored procedures from a relational database. You can create the SQL transformation in the Developer tool by importing a stored procedure. The Developer tool adds the ports and the stored procedure call. You can manually add more stored procedure calls in the SQL transformation. Return zero rows, one row, or result sets from the stored procedure.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Tableau
You can query a deployed SQL data service with Tableau through the Informatica Data Services ODBC driver.
For more information, see the Informatica 9.6.0 Data Services Guide.
Web Service Consumer Transformation
The Web Service Consumer transformation has the following new features in this release:
- •The external web service provider can authenticate the Integration Service using NTLMv2.
- •In a Web Service Consumer transformation, you can use a WSDL with a one-way message pattern.
For more information, see the Informatica 9.6.0 Developer Transformation Guide.
Informatica Data Transformation
This section describes new features and enhancements to Informatica Data Transformation.
Data Processor Transformation Wizard
You can use a wizard to create a Data Processor transformation in the Developer tool with COBOL, ASN.1, relational, or JSON input or output.
For more information about the wizard, see the Informatica 9.6.0 Data Transformation User Guide.
Relational Input
A Data Processor transformation can transform relational input into hierarchical output.
For more information about relational input, see the Informatica 9.6.0 Data Transformation User Guide.
XMap with JSON
You can create an XMap that reads or writes JSON directly.
For more information about XMap or JSON, see the Informatica 9.6.0 Data Transformation User Guide.
XMap with Transformers
In an XMap mapping statement, you can include any user-defined transformer with the dp:transform function. Use the XPath Editor to add the dp:transform function to the input, output, or condition fields.
For more information about XPath and the XPath editor, see the Informatica 9.6.0 Data Transformation User Guide.
Informatica Developer
This section describes new features and enhancements to Informatica Developer.
Alerts
In the Developer tool, you can view connection status alerts in the Alerts view.
For more information, see the Informatica 9.6.0 Developer Tool Guide.
Functions
In the Developer tool, you can use the following functions in the transformation language:
- •UUID4(). Returns a randomly generated 16-byte binary value.
- •UUID_UNPARSE(binary). Takes a 16-byte binary argument and returns a 36-character string.
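For example, the following expression nests the two functions to produce a UUID in string form. This is an illustrative sketch based on the function descriptions above:
UUID_UNPARSE(UUID4())
The expression returns a 36-character string representation of the randomly generated 16-byte binary value.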
For more information, see the Informatica 9.6.0 Developer Transformation Language Reference.
JDBC Connectivity
You can use the Data Integration Service to read from relational database sources and write to relational database targets through JDBC. JDBC drivers are installed with the Informatica services and the Informatica clients. You can also download a JDBC 3.0-compliant driver from a third-party vendor website. You can use the JDBC driver to import database objects, such as views and tables, preview data for a transformation, and run mappings.
For more information, see the Informatica 9.6.0 Developer Tool Guide.
Keyboard Accessibility
In the Developer tool, you can use keyboard shortcuts to work with objects and ports in the editor. You can also use keyboard shortcuts to navigate the Transformation palette and the workbench.
For more information, see the Informatica 9.6.0 Developer Tool Guide.
Model Repository Service Refresh
In the Developer tool, you can refresh the Model Repository Service to see new and updated objects in the Model repository.
For more information, see the Informatica 9.6.0 Developer Tool Guide.
Object Dependencies
In the Developer tool, you can view the object dependencies for an object in the Object Dependencies view to perform an impact analysis on affected objects before you modify or delete the object.
For more information, see the Informatica 9.6.0 Developer Tool Guide.
Passphrases
In the Developer tool, you can enter a passphrase instead of a password for the following connection types:
- •Adabas
- •DB2 for i5/OS
- •DB2 for z/OS
- •IMS
- •Sequential
- •VSAM
A valid passphrase for accessing databases and data sets on z/OS can be up to 128 characters in length. A valid passphrase for accessing i5/OS can be up to 31 characters in length. Passphrases can contain the following characters:
- •Uppercase and lowercase letters
- •The numbers 0 to 9
- •Spaces
- •The following special characters:
’ - ; # \ , . / ! % & * ( ) _ + { } : @ | < > ?
Note: The first character is an apostrophe.
For more information, see the Informatica 9.6.0 Developer Tool Guide.
Informatica Development Platform
This section describes new features and enhancements to Informatica Development Platform.
Design API
Version 9.6.0 includes the following enhancements for the Design API:
- •You can use the Design API to fetch an XML source or XML target from the PowerCenter repository.
- •You can use the Design API to connect to a hierarchical VSAM data source or target through PowerExchange.
- •You can use the Design API to perform repository functions in a domain that uses Kerberos authentication. You can enable Kerberos authentication through the pcconfig.properties file or when you create a Repository object.
For more information, see the Informatica Development Platform 9.6.0 Developer Guide.
Informatica Connector Toolkit
You can use the Informatica Connector Toolkit to build an adapter that provides connectivity between a data source and the Informatica platform. The Informatica Connector Toolkit consists of libraries, plug-ins, and sample code that you can use to develop an adapter in an Eclipse environment.
For more information, see the Informatica Development Platform 9.6.0 Informatica Connector Toolkit Developer Guide.
Informatica Domain
This section describes new features and enhancements to the Informatica domain.
Analyst Service
Version 9.6.0 includes the following enhancements to the Analyst Service:
- •You can select a Data Integration Service configured to run Human tasks. If the Data Integration Service associated with the Analyst Service is not configured to run Human tasks, choose a different Data Integration Service.
- •You can select a Search Service to enable searches in the Analyst tool.
- •You can set the location of the export file directory to export a business glossary.
For more information, see the Informatica 9.6.0 Application Service Guide.
Content Management Service
You can set the location of the SendRight report file on the Content Management Service. Generate a SendRight report when you run an address validation mapping in certified mode on New Zealand address records. The report verifies that the address records meet the certification standards of New Zealand Post.
For more information, see the Informatica 9.6.0 Application Service Guide.
The Content Management Service manages the compilation of rule specifications into mapplets. When you compile a rule specification in the Analyst tool, the Analyst Service selects a Content Management Service to generate the mapplet. The Analyst tool uses the Model Repository Service configuration to select the Content Management Service.
For more information, see the Informatica 9.6.0 Application Service Guide.
High Availability
Version 9.6.0 includes the following enhancements to high availability for services:
- •When the Model Repository Service becomes unavailable, the Service Manager can restart the service on the same node or a backup node. You can configure the Model Repository Service to run on one or more backup nodes.
- •When the Data Integration Service becomes unavailable, the Service Manager can restart the service on the same node or a backup node. You can configure the Data Integration Service to run on one or more backup nodes.
- •When the Data Integration Service fails over or restarts unexpectedly, you can enable automatic recovery of aborted workflows.
- •You can enable the PowerCenter Integration Service to store high availability persistence information in database tables. The PowerCenter Integration Service stores the information in the associated repository database.
For more information, see the Informatica 9.6.0 Administrator Guide.
Log Management
In the Administrator tool, you can aggregate logs at the domain level or the service level based on scenarios. You can also compress the aggregated log files to save disk space.
For more information, see the Informatica 9.6.0 Administrator Guide.
Passphrases
You can enter a passphrase instead of a password at the following locations:
- •In the -ConnectionPassword option of the infacmd isp CreateConnection and UpdateConnection commands for ADABAS, DB2I, DB2Z, IMS, SEQ, or VSAM connections.
- •In the -pwxPassword option of the infacmd pwx createdatamaps command for IMS, SEQ, and VSAM data sources.
- •In the Administrator tool, for DB2 for i5/OS and DB2 for z/OS connections.
A valid passphrase for accessing databases and data sets on z/OS can be up to 128 characters in length. A valid passphrase for accessing i5/OS can be up to 31 characters in length. Passphrases can contain the following characters:
- •Uppercase and lowercase letters
- •The numbers 0 to 9
- •Spaces
- •The following special characters:
’ - ; # \ , . / ! % & * ( ) _ + { } : @ | < > ?
Note: The first character is an apostrophe.
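For example, a command of the following form might update a DB2 for z/OS connection to use a passphrase. The domain, user, and connection names are placeholders, other required options are omitted, and you typically enclose a passphrase that contains spaces in quotation marks:
infacmd isp UpdateConnection -DomainName InfaDomain -UserName Administrator -Password AdminPassword -ConnectionName DB2Z_conn -ConnectionPassword 'October passphrase example 2014'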
For more information, see the Informatica 9.6.0 Administrator Guide and Informatica 9.6.0 Command Reference.
Search Service
Create a Search Service to enable search in the Analyst tool and Business Glossary Desktop.
For more information, see the Informatica 9.6.0 Application Service Guide.
Workflow Graph
In the Administrator tool, you can view a graphical representation of a workflow that you run. You can view the details of the tasks within the workflow and the failure points.
For more information, see the Informatica 9.6.0 Administrator Guide.
Informatica Domain Security
This section describes security enhancements to the Informatica domain.
Authentication
You can run Informatica with Kerberos authentication and Microsoft Active Directory (AD) directory service. Kerberos authentication provides single sign-on capability to Informatica domain client applications. The Informatica domain supports Active Directory 2008 R2.
Two-Factor Authentication (TFA)
Informatica clients can run on a Windows network that uses two-factor authentication.
Encryption Key
You can specify a keyword to generate a unique encryption key for encrypting sensitive data such as passwords that are stored in the domain.
Workflow Security
You can configure the PowerCenter Integration Service to run PowerCenter workflows securely. The Enable Data Encryption option enables secure communication between the PowerCenter Integration Service and the Data Transformation Manager (DTM) process and between DTM processes.
Administrator Group
The Informatica domain includes an Administrator group with default administrator privileges. You can add users to or remove users from the Administrator group. You cannot delete the Administrator group.
Administrator Account Lockout
When you configure account lockout in the Administrator tool, you can enforce account lockout for administrator user accounts. When you enable the Account Lockout option, you can also enable the Admin Account Lockout option, which applies the lockout policy to administrator user accounts.
Connection to Secure Relational Databases
You can use the Informatica relational database drivers to connect to a secure Oracle, Microsoft SQL Server, or IBM DB2 database. You can create repositories, sources, and targets on databases secured with SSL certificates.
Audit Reports
In the Administrator tool, you can generate audit reports to get information on users and groups in the Informatica domain. For example, you can get information about a user account, such as the privileges and permissions assigned to the user and the groups associated with the user.
Analyst Service Privileges
The following table describes new privileges for the Analyst Service:
| Privilege | Description |
| --- | --- |
| Manage Glossaries | User is able to manage business glossaries. |
| Workspace Access | User is able to access the Design, Discovery, Glossary, and Scorecards workspaces in the Analyst tool. |
| Design Workspace | User is able to access the Design workspace. |
| Discovery Workspace | User is able to access the Discovery workspace. |
| Glossary Workspace | User is able to access the Glossary workspace. |
| Scorecards Workspace | User is able to access the Scorecards workspace. |
Model Repository Service Privileges
The following table describes new privileges for the Model Repository Service:
| Privilege | Description |
| --- | --- |
| Access Analyst | User is able to access the Model repository from the Analyst tool. |
| Access Developer | User is able to access the Model repository from the Developer tool. |
For more information, see the Informatica 9.6.0 Security Guide.
Command Line Programs
This section describes new and changed commands and options for the Informatica command line programs.
infacmd as Commands
The following table describes an updated infacmd as command:
| Command | Description |
| --- | --- |
| CreateService | Contains the following new options: -HumanTaskDataIntegrationService (-htds), optional, the name of the Data Integration Service that runs Human tasks; -BusinessGlossaryExportFileDirectory (-bgefd), optional, the location of the directory to which business glossary files are exported. Contains the following obsolete option: -StagingDatabase (-sd), previously required, the database connection name for a staging database. |
| UpdateServiceOptions | Updates Analyst Service options. In version 9.6.0, you can run the command to specify a Data Integration Service that runs Human tasks. For example, the following command configures the Analyst Service to use DS_ID_100 as the Data Integration Service: infacmd as UpdateServiceOptions -dn InfaDomain -sn AS_ID_100 -un Username -pd Password HumanTaskDataIntegrationService.humanTaskDsServiceName=DS_ID_100 |
The following table describes obsolete infacmd as commands:
| Command | Description |
| --- | --- |
| CreateAuditTables | Creates audit tables that contain audit trail log events for bad record tables and duplicate tables in a staging database. Update any script that uses infacmd as CreateAuditTables. |
| DeleteAuditTables | Deletes audit tables that contain audit trail log events for bad record tables and duplicate tables in a staging database. Update any script that uses infacmd as DeleteAuditTables. |
infacmd dis Commands
The following table describes updated infacmd dis commands:
| Command | Description |
| --- | --- |
| CreateService | Contains the following new option: -BackupNodes (-bn). Optional. Name of the backup nodes. |
| UpdateService | Contains the following new option: -BackupNodes (-bn). Optional. Name of the backup nodes. |
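For example, a command of the following form might assign backup nodes to an existing Data Integration Service. The domain, service, and node names are placeholders, and the comma-separated value format for the backup nodes is an assumption; see the Command Reference for the exact syntax:
infacmd dis UpdateService -DomainName InfaDomain -ServiceName DIS_ID_100 -UserName Administrator -Password AdminPassword -BackupNodes node2,node3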
infacmd idd Commands
The infacmd idd commands are obsolete. Update any script that refers to an infacmd idd command.
The following table describes the obsolete infacmd idd commands:
| Command | Description |
| --- | --- |
| CreateService | Creates a Data Director Service. |
| ListServiceOptions | Lists the Data Director Service options. |
| ListServiceProcessOptions | Lists the Data Director Service process options. |
| RemoveService | Removes the Data Director Service. |
| UpdateServiceOptions | Updates the Data Director Service options. |
| UpdateServiceProcessOptions | Updates the Data Director Service process options. |
infacmd isp Commands
The following table describes updated infacmd isp commands:
| Command | Description |
| --- | --- |
| AssignISToMMService | Contains the following new option: -RepositoryUserSecurityDomain (-rsdn). Optional. Name of the security domain to which the PowerCenter repository user belongs. |
| CreateConnection | Contains the following updated option: -ConnectionPassword. You can enter a passphrase for ADABAS, DB2I, DB2Z, IMS, SEQ, or VSAM connections. A passphrase can be up to 128 characters in length for z/OS connections and up to 31 characters in length for DB2 for i5/OS connections. A passphrase can contain letters, numbers, spaces, and some special characters. |
| CreateIntegrationService | Contains the following service option (-so): StoreHAPersistenceInDB. Optional. Stores process state information in high availability persistence tables in the associated PowerCenter repository database. Default is no. |
| EnableService | Can enable the Search Service. |
| GetLog | Contains the argument SEARCH for the ServiceType option. Use the argument to get the log events for the Search Service. |
| ListServices | Contains the argument SEARCH for the ServiceType option. Use the argument to get a list of all Search Services running in the domain. |
| UpdateConnection | Contains the following updated option: -ConnectionPassword. You can enter a passphrase for ADABAS, DB2I, DB2Z, IMS, SEQ, or VSAM connections. A passphrase can be up to 128 characters in length for z/OS connections and up to 31 characters in length for DB2 for i5/OS connections. A passphrase can contain letters, numbers, spaces, and some special characters. |
| UpdateDomainOptions | Contains the following domain option (-do): ServiceResilTimeout. Amount of time in seconds that a service tries to establish or reestablish a connection to another service. |
| UpdateGatewayInfo | Contains the following new option: -Force (-f). Optional. Updates or creates the domains.infa file even when the connection to the domain fails. The -Force option sets the Kerberos and TLS enabled options to false in the domains.infa file if the connection to the domain fails. If you do not specify the -Force option, the command does not update the domains.infa file if the connection to the domain fails. Previously, the command did not check for error messages when it updated the gateway node with the connectivity information that you specified. |
| UpdateIntegrationService | Contains the following service option (-so): StoreHAPersistenceInDB. Optional. Stores process state information in high availability persistence tables in the associated PowerCenter repository database. Default is no. |
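For example, a command of the following form might retrieve log events for a Search Service. The domain name and user credentials are placeholders:
infacmd isp GetLog -DomainName InfaDomain -UserName Administrator -Password AdminPassword -ServiceType SEARCH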
infacmd mrs Commands
The following table describes updated infacmd mrs commands:
| Command | Description |
| --- | --- |
| CreateService | Contains the following new option: -BackupNodes (-bn). Optional. Name of the backup nodes. |
| UpdateService | Contains the following new options: -PrimaryNode (-nn), optional, the name of the primary node; -BackupNodes (-bn), optional, the name of the backup nodes. |
infacmd ps Commands
The following table describes new infacmd ps commands:
| Command | Description |
| --- | --- |
| migrateProfileResults | Migrates column profile results and data domain discovery results from versions 9.1.0, 9.5.0, or 9.5.1. |
| synchronizeProfile | Migrates documented keys, user-defined keys, committed keys, primary keys, and foreign keys for all the profiles in a specific project from versions 9.1.0, 9.5.0, or 9.5.1. |
infacmd pwx Commands
The following table describes a new infacmd pwx command:
| Command | Description |
| --- | --- |
| createdatamaps | Creates PowerExchange data maps for IMS, SEQ, or VSAM data sources for bulk data movement. |
infacmd search Commands
The following table describes the new infacmd search commands:
| Command | Description |
| --- | --- |
| createService | Creates a Search Service. |
| listServiceOptions | Lists the properties for a Search Service. |
| listServiceProcessOptions | Lists the properties for a Search Service process. |
| updateServiceOptions | Configures properties for a Search Service. |
| updateServiceProcessOptions | Configures properties for a Search Service process. |
For more information, see the Informatica 9.6.0 Command Reference.
PowerCenter
This section describes new features and enhancements to PowerCenter.
Pushdown Optimization for SAP HANA
The PowerCenter Integration Service can push transformation logic to SAP HANA sources and targets when the connection type is ODBC.
For more information, see the Informatica PowerCenter 9.6.0 Advanced Workflow Guide.
High Availability Persistence in a Database
You can enable the PowerCenter Integration Service to store high availability persistence information in database tables. The PowerCenter Integration Service stores the information in the associated repository database.
For more information, see the Informatica 9.6.0 Administrator Guide.
Transformations
You can use a parameter file to provide cache size values in the following transformations:
- •Aggregator
- •Joiner
- •Rank
- •Sorter
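For example, a session that uses user-defined parameters for cache sizes might resolve them from a parameter file entry similar to the following sketch. The folder, workflow, session, and parameter names are hypothetical; you assign each parameter to the corresponding cache size attribute in the session properties:
[MyFolder.WF:wf_load_orders.ST:s_m_aggregate_orders]
$ParamAggDataCacheSize=20000000
$ParamJoinerIndexCacheSize=10000000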
For more information, see the Informatica PowerCenter 9.6.0 Transformation Guide.
PowerCenter Big Data Edition
This section describes new features and enhancements to PowerCenter Big Data Edition.
Automatic Workflow Recovery
You can configure automatic recovery of workflow instances that are aborted because of an unexpected shutdown of the Data Integration Service process. When you configure automatic recovery, the Data Integration Service process recovers the aborted workflow instances when the service process restarts.
For more information, see the Informatica 9.6.0 Developer Workflow Guide.
Mappings in the Hive Environment
- •You can run mappings with Cloudera 4.2, Hortonworks 1.3.2, MapR 2.1.3, and MapR 3.0.1 distributions.
- •When you choose Hive as the validation environment for the mapping, you can now choose a Hive version.
- •You can append to a Hive target table with Hive version 0.9 and later.
- •In a Java transformation, you can configure an input port as a partition key and a sort key, and you can assign a sort direction to get sorted output data.
- •To modify the Hadoop distribution directory on the Hadoop data nodes and the Data Integration Service node, use the Hadoop resource descriptor configuration file, hadoopRes.properties.
For more information, see the Informatica PowerCenter Big Data Edition 9.6.0 User Guide.
Partitioned Mappings in the Native Environment
If you have the Partitioning option, you can enable the Data Integration Service process to maximize parallelism when it runs mappings in the native environment. The Data Integration Service process must run on a node that has multiple CPUs. When you maximize parallelism, the Data Integration Service dynamically divides the underlying data into partitions and processes all of the partitions concurrently. When the Data Integration Service adds partitions, it increases the number of processing threads, which can increase mapping performance.
For more information, see the Informatica 9.6.0 Mapping Guide.
PowerCenter Advanced Edition
This section describes new features and enhancements to PowerCenter Advanced Edition.
Business Glossary
Business Glossary comprises online glossaries of business terms and policies that define important concepts within an organization. Data stewards create and publish terms that include information such as descriptions, relationships to other terms, and associated categories. Glossaries are stored in a central location for easy lookup by end-users.
Business Glossary is made up of glossaries, business terms, policies, and categories. A glossary is the high-level container that stores other glossary content. A business term defines relevant concepts within the organization, and a policy defines the business purpose that governs practices related to the term. Business terms and policies can be associated with categories, which are descriptive classifications. You can access Business Glossary through Informatica Analyst (the Analyst tool).
For more information, see the Informatica 9.6.0 Business Glossary Guide.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
Security Enhancements
Metadata Manager contains the following security enhancements:
- Connection to secure relational databases
Metadata Manager can communicate with secure IBM DB2, Microsoft SQL Server, and Oracle databases. Metadata Manager can communicate with these databases when they are used for the Metadata Manager repository, for the PowerCenter repository, or as metadata sources.
For more information, see the Informatica PowerCenter 9.6.0 Metadata Manager Administrator Guide.
- Kerberos authentication
Metadata Manager can run on a domain that is configured with Kerberos authentication.
For information about configuring the domain to use Kerberos authentication, see the Informatica 9.6.0 Security Guide. For information about running Metadata Manager and mmcmd when the domain uses Kerberos authentication, see the Informatica PowerCenter 9.6.0 Metadata Manager Administrator Guide.
- Two-factor authentication
Metadata Manager can run on a Windows network that uses two-factor authentication.
For more information, see the Informatica 9.6.0 Security Guide.
Business Glossary Resources
You can create Business Glossary resources that are based on Informatica Analyst business glossaries. Create a Business Glossary resource to extract metadata from an Informatica Analyst business glossary.
For information about creating resources, see the Informatica PowerCenter 9.6.0 Metadata Manager Administrator Guide. For information about viewing resources, see the Informatica PowerCenter 9.6.0 Metadata Manager User Guide.
Resource Versions
You can create resources of the following versions:
- •MicroStrategy 9.3.1 and 9.4.1. Previously, you could create MicroStrategy resources up to version 9.2.1.
- •Netezza 7.0. Previously, you could create Netezza resources up to version 6.0.
For information about creating resources, see the Informatica PowerCenter 9.6.0 Metadata Manager Administrator Guide.
Browser Support
You can run the Metadata Manager application in the Google Chrome web browser.
PowerExchange Adapters for PowerCenter
This section describes new features and enhancements to PowerExchange adapters for PowerCenter.
- PowerExchange for Greenplum
- You can configure a session to override the schema that is specified in the Greenplum connection object.
- For more information, see the Informatica PowerExchange for Greenplum 9.6.0 User Guide for PowerCenter.
- PowerExchange for Hadoop
- PowerExchange for Hadoop supports the following updated versions of Hadoop distributions to access Hadoop sources and targets:
- - Cloudera CDH 4.2
- - Hortonworks 1.3.2
- - MapR 2.1.3 and 3.0.1
- - Pivotal HD 1.1
- - IBM BigInsights-2.1
- For more information, see the Informatica PowerExchange for Hadoop 9.6.0 User Guide for PowerCenter.
- PowerExchange for Microsoft Dynamics CRM
- - You can use Microsoft Dynamics CRM Online version 2013 for online deployment.
- - You can configure the number of rows that you want to retrieve from Microsoft Dynamics CRM.
- - You can join two related entities that have one-to-many or many-to-one relationships.
- - PowerExchange for Microsoft Dynamics CRM uses HTTP compression to extract data if HTTP compression is enabled in the Internet Information Services (IIS) where Microsoft Dynamics CRM is installed.
- - You can configure the PowerCenter Integration Service to write records in bulk mode.
- - You can change the location of the krb5.conf file and the login.conf files at run time.
- For more information, see the Informatica PowerExchange for Microsoft Dynamics CRM 9.6.0 User Guide for PowerCenter.
- PowerExchange for SAP NetWeaver
- - PowerExchange for SAP NetWeaver uses SAP NetWeaver RFC SDK 7.20 libraries.
- - You can enable partitioning for SAP BW sessions that load data to 7.x DataSources. When you enable partitioning, the PowerCenter Integration Service performs the extract, transform, and load for each partition in parallel.
- - You can run ABAP stream mode sessions with the Remote Function Call communication protocol.
- - You can install secure transports to enforce security authorizations when you use ABAP to read data from SAP.
- For more information, see the Informatica PowerExchange for SAP 9.6.0 User Guide for PowerCenter.
- PowerExchange for SAS
- You can read data directly from a SAS data file.
- For more information, see the Informatica PowerExchange for SAS 9.6.0 User Guide for PowerCenter.
- PowerExchange for Siebel
- When you import Siebel business components, you can specify the name of the Siebel repository if multiple Siebel repositories are available. You can create and configure the connection.properties file to add the Repository Name field to the Import from Siebel wizard in PowerExchange for Siebel.
- For more information, see the Informatica PowerExchange for Siebel 9.6.0 User Guide for PowerCenter.
- PowerExchange for Teradata Parallel Transporter API
- - You can configure a session so that Teradata PT API uses one of the spool modes to extract data from Teradata.
- - You can configure a session to use a character in place of an unsupported Teradata Unicode character while loading data to targets.
- For more information, see the Informatica PowerExchange for Teradata Parallel Transporter API 9.6.0 User Guide for PowerCenter.
- PowerExchange for Web Services
- - The PowerCenter Integration Service can process SOAP 1.2 messages with RPC/encoded and document/literal encoding styles. Each web service can have an operation that uses a SOAP 1.2 binding. You can create a Web Service Consumer transformation with a SOAP 1.2 binding.
- - You can use PowerExchange for Web Services with SharePoint 2010 and 2013 as a web service provider.
- For more information, see the Informatica PowerExchange for Web Services 9.6.0 User Guide for PowerCenter.
PowerExchange Adapters for Informatica
This section describes new features and enhancements to PowerExchange adapters for Informatica.
- PowerExchange for HBase
- PowerExchange for HBase provides connectivity to an HBase data store. Use PowerExchange for HBase to read data from HBase column families or write data to column families in an HBase table. You can read or write data to a column family or a single binary column.
- You can add an HBase data object operation as a source or as a target in a mapping and run the mapping in the native environment or a Hive environment.
- For more information, see the PowerExchange for HBase 9.6.0 User Guide.
- PowerExchange for DataSift
- You can configure the HTTP proxy server authentication settings at design time (see the sketch after this adapter's entry).
- For more information, see the Informatica PowerExchange for DataSift 9.6.0 User Guide.
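The same HTTP proxy authentication setting appears for several adapters in this section, including Facebook, LinkedIn, Salesforce, Twitter, and Web Content-Kapow Katalyst. As a rough illustration of what the design-time settings translate into, the following sketch configures an authenticating proxy at the JVM level; the host, port, and credentials are hypothetical, and the actual adapter connection attributes are described in each user guide.

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

public class ProxyAuthSketch {
    public static void main(String[] args) {
        // Route outbound HTTPS traffic through a hypothetical proxy.
        System.setProperty("https.proxyHost", "proxy.example.com");
        System.setProperty("https.proxyPort", "8080");

        // Supply credentials when the proxy challenges the client.
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("proxyUser", "proxyPassword".toCharArray());
            }
        });

        // Subsequent java.net connections (for example, HttpURLConnection)
        // now negotiate with the proxy before reaching the web service.
    }
}
```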
- PowerExchange for Facebook
- - You can extract information from Facebook about a group, the news feed of a group, the list of members in a group, basic information about a page, and the news feed of a page.
- - You can configure the HTTP proxy server authentication settings at design time.
- For more information, see the Informatica PowerExchange for Facebook 9.6.0 User Guide.
- PowerExchange for HDFS
- - PowerExchange for HDFS supports the following Hadoop distributions to access HDFS sources and targets:
- ▪ Cloudera CDH Version 4 Update 2
- ▪ HortonWorks 1.3.2
- ▪ MapR 2.1.3
- ▪ MapR 3.0.1
- - You can write text files and binary file formats, such as sequence files, to HDFS with a complex file data object.
- - You can write compressed complex files, specify compression formats, and decompress files (see the sketch after this adapter's entry).
- - The Data Integration Service creates partitions to read data from sequence files and custom input format files that can be split.
- For more information, see the Informatica PowerExchange for HDFS 9.6.0 User Guide.
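The following sketch shows what writing a compressed sequence file to HDFS looks like with the plain Hadoop API, which is the kind of output a complex file data object produces. The target path, key and value types, and Gzip block compression are assumptions for illustration; the data object properties control the actual format and compression codec.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.GzipCodec;

public class SequenceFileWriteSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Hypothetical HDFS target path for the compressed sequence file.
        Path path = new Path("hdfs:///tmp/example.seq");

        SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(path),
                SequenceFile.Writer.keyClass(IntWritable.class),
                SequenceFile.Writer.valueClass(Text.class),
                SequenceFile.Writer.compression(
                        SequenceFile.CompressionType.BLOCK, new GzipCodec()));
        try {
            writer.append(new IntWritable(1), new Text("first record"));
            writer.append(new IntWritable(2), new Text("second record"));
        } finally {
            writer.close();
        }
    }
}
```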
- PowerExchange for Hive
- - PowerExchange for Hive supports the following Hive distributions to access Hive sources and targets:
- ▪ Cloudera CDH Version 4 Update 2
- ▪ HortonWorks 1.3.2
- ▪ MapR 2.1.3
- ▪ MapR 3.0.1
- - You can write to Hive partitioned tables when you run mappings in a Hive environment (see the sketch after this entry).
- For more information, see the Informatica PowerExchange for Hive 9.6.0 User Guide.
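To illustrate what writing to a Hive partitioned table means, the following sketch inserts rows into one partition over the HiveServer2 JDBC driver. The connection URL, table, and partition column are hypothetical; mappings that run in a Hive environment perform the equivalent work without hand-written HiveQL.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HivePartitionedWriteSketch {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver and a hypothetical connection URL.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver.example.com:10000/default", "user", "");
        try {
            Statement stmt = conn.createStatement();
            // Load staged rows into a single partition of the "sales" table.
            stmt.execute("INSERT INTO TABLE sales PARTITION (sale_date='2014-01-31') "
                    + "SELECT order_id, amount FROM staging_sales WHERE dt='2014-01-31'");
            stmt.close();
        } finally {
            conn.close();
        }
    }
}
```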
- PowerExchange for LinkedIn
- - You can specify the full name of a person when you look up company information in LinkedIn.
- - You can configure the HTTP proxy server authentication settings at design time.
- For more information, see the Informatica PowerExchange for LinkedIn 9.6.0 User Guide.
- PowerExchange for Salesforce
- - You can select specific records from Salesforce by using a filter in the query property of the Salesforce data object read operation.
- - You can use a Salesforce data object read operation to look up data in a Salesforce object.
- - You can configure the HTTP proxy server authentication settings at design time.
- For more information, see the Informatica PowerExchange for Salesforce 9.6.0 User Guide.
- PowerExchange for SAP NetWeaver
- - PowerExchange for SAP NetWeaver uses SAP NetWeaver RFC SDK 7.20 libraries.
- - You can install secure transports to enforce security authorizations when you use ABAP to read data from SAP.
- For more information, see the Informatica PowerExchange for SAP 9.6.0 User Guide.
- PowerExchange for Twitter
- - You can specify a valid user ID or screen name to extract the profile of a single user, or specify a list of user IDs or screen names in a .txt or .csv file to extract the profiles of multiple users.
- - You can configure the HTTP proxy server authentication settings at design time.
- For more information, see the Informatica PowerExchange for Twitter 9.6.0 User Guide.
- PowerExchange for Web Content-Kapow Katalyst
- You can configure the HTTP proxy server authentication settings at design time.
- For more information, see the Informatica PowerExchange for Web Content-Kapow Katalyst 9.6.0 User Guide.
Informatica Documentation
This section describes new guides included with the Informatica documentation. Some new guides are organized based on shared functionality among multiple products and replace previous guides.
The Informatica documentation contains the following new guides:
- Informatica Analyst Tool Guide
- Contains general information about Informatica Analyst (the Analyst tool). Previously, the Analyst tool was documented in the Informatica Data Integration Analyst User Guide.
- Informatica Application Service Guide
- Contains information about application services. Previously, the application services were documented in the Informatica Administrator Guide.
- Informatica Connector Toolkit Developer Guide
- Contains information about the Informatica Connector Toolkit and how to develop an adapter for the Informatica platform. You can find information on the components that you define to develop an adapter, such as connection attributes, type system, metadata objects, and run-time behavior.
- Informatica Connector Toolkit Getting Started Guide
- Contains a tutorial on how to use the Informatica Connector Toolkit to develop a sample MySQL adapter for the Informatica platform. You can find information on how to install the Informatica Connector Toolkit and how to create and publish the sample MySQL adapter.
- Informatica Data Explorer Data Discovery Guide
- Contains information about discovering the metadata of source systems, including content and structure. You can find information on column profiles, data domain discovery, primary key and foreign key discovery, functional dependency discovery, join analysis, and enterprise discovery. Previously, data discovery was documented in the Informatica Data Explorer User Guide.
- Informatica Business Glossary Guide
- Contains information about Business Glossary. You can find information about how to manage and look up glossary content in the Analyst tool. Glossary content includes terms, policies, and categories. Previously, information about Metadata Manager Business Glossary was documented in the Informatica PowerCenter Metadata Manager Business Glossary Guide.
- Informatica Data Quality Exception Management Guide
- Contains information about exception management for Data Quality. You can find information about managing exception record tasks in the Analyst tool. Previously, exception management was documented in the Informatica Data Director for Data Quality Guide, Data Quality User Guide, and Data Services User Guide.
- Informatica Database View Reference
- Contains information about Model Repository views, Profile Warehouse views, and Business Glossary views. Previously, this book was called the Informatica Data Services Model Repository Views, and the profile views were documented in an H2L article. The Business Glossary views are new content in this book.
- Informatica Developer Tool Guide
- Contains information about Informatica Developer. You can find information on common functionality in the Developer tool. Previously, the Developer tool was documented in the Informatica Developer User Guide.
- Informatica Mapping Guide
- Contains information about configuring Model repository mappings. Previously, the mapping configuration was documented in the Informatica Developer User Guide.
- Informatica Mapping Specifications Getting Started Guide
- Contains getting started information for mapping specifications.
- Informatica Mapping Specifications Guide
- Contains information about mapping specifications. Previously, the mapping specifications were documented in the Informatica Data Integration Analyst User Guide.
- Informatica Profile Guide
- Contains information about profiles. The guide contains basic information about running column profiles, creating rules, and creating scorecards. Previously, profiling was documented in the Data Quality User Guide and Informatica Data Explorer User Guide.
- Informatica Reference Data Guide
- Contains information about reference data objects. A reference data object contains a set of data values that you can use to perform search operations in source data. You can create reference data objects in the Developer tool and Analyst tool, and you can import reference data objects to the Model repository. Previously, reference data objects were documented in the Informatica Data Quality User Guide.
- Informatica Rule Builder Guide
- Contains information about the Rule Builder feature in the Analyst tool. Use Rule Builder to describe business rule requirements as a series of logical statements. You compile the logical statements into a rule specification. The Analyst tool saves a copy of the rule specification as a mapplet in the Model repository.
- Informatica Security Guide
- Contains information about security for the Informatica domain. Previously, Informatica security was documented in the Informatica Administrator Guide.
- Informatica SQL Data Service Guide
- Contains information about creating SQL data services, populating virtual data, and connecting to an SQL data service with third-party tools. Previously, this book was called the Informatica Data Services User Guide.