Version 9.5.0

This section describes new features and enhancements in version 9.5.0.

Informatica Installer

This section describes new features and enhancements to the Informatica platform installer.

Install Application Client Components

You can specify the Informatica application client components that you want to install. For example, you can install all of the application clients or a subset of application clients.

Pre-Installation (i9Pi) System Check Tool

Before you install or upgrade the Informatica services, you can run the Pre-Installation (i9Pi) System Check Tool to verify that the machine meets the minimum system and database requirements for the installation.

Uninstall Application Client Components

You can specify the Informatica application client components that you want to uninstall.

Informatica Data Explorer

This section describes new features and enhancements to Informatica Data Explorer.

Connections

Version 9.5.0 includes the following enhancements for connections:

Connectivity to SAP HANA

You can connect to an SAP HANA database using ODBC.

Data Domain Discovery

You can identify critical data characteristics within the enterprise so that you can apply data management policies, such as data masking or data quality, to the data. Run a profile to identify the data domains for a column based on either its values or its name. A data domain is the logical datatype of a column or the set of allowed values that the column can have. The name of the data domain helps you find the functional meaning of the column data.
You can perform data domain discovery in both the Analyst tool and the Developer tool.
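Value-based discovery can be pictured as matching a sample of column values against each domain's pattern. The following is a minimal sketch in Python; the domain names, patterns, threshold, and function name are illustrative assumptions, not Informatica's implementation:

```python
import re

# Hypothetical data domain definitions: domain name -> pattern for conforming values.
DOMAINS = {
    "US_PHONE": re.compile(r"^\d{3}-\d{3}-\d{4}$"),
    "EMAIL": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

def infer_domains(values, min_conformance=0.8):
    """Return domain names whose pattern matches at least
    min_conformance of the non-null column values."""
    values = [v for v in values if v]
    hits = []
    for name, pattern in DOMAINS.items():
        matched = sum(1 for v in values if pattern.match(v))
        if values and matched / len(values) >= min_conformance:
            hits.append(name)
    return hits
```

Name-based discovery works analogously, matching the column name instead of its values.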

Enterprise Discovery

You can run multiple data discovery tasks on a large number of data sources across multiple connections and generate a consolidated summary of the profile results. This data discovery method includes running a column profile, performing data domain discovery, and discovering primary key and foreign key relationships. You can view the results in both graphical and tabular formats.
You can run enterprise discovery from a profile model in the Developer tool.

Find in Editor

In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.

Project Permissions

You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.

Scorecards

Informatica Data Quality

This section describes new features in Informatica Data Quality.

Address Validator Transformation

The Address Validator transformation can perform consumer marketing and segmentation analysis on address data. Select the CAMEO options in the transformation to perform consumer marketing and segmentation analysis.
The Address Validator transformation can add Enhanced Line of Travel (eLOT) data to a United States address. Mail carriers use eLOT data to sort mail items in the order in which they are likely to be delivered on a mail route. The Address Validator transformation runs in Certified mode when it creates eLOT output.

Connections

Version 9.5.0 includes the following enhancements for connections:

Connectivity to SAP HANA

You can connect to an SAP HANA database using ODBC.

Content Management Service

The Content Management Service has the following features:

Data Masking Transformation

The Data Masking transformation contains the following data masking techniques:

Find in Editor

In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.

Import from PowerCenter

You can import objects from a PowerCenter repository to a Model repository. You can connect to a PowerCenter repository from the Developer tool and select objects to import into a target location in the Model repository. The import process validates and converts the PowerCenter objects to Model repository objects based on compatibility. You can check the feasibility of the import before the final import. The Developer tool creates a final summary report with the results of the import.

Decision Transformation

The Decision transformation handles integer values in IF/ELSE statements in addition to Boolean values. The transformation processes a 0 value as false and any other integer value as true.
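The rule is the familiar zero-is-false convention; a sketch of the equivalent logic in Python (the function name is illustrative):

```python
def decision_branch(condition):
    """Mimic the Decision transformation's IF/ELSE handling of integers:
    0 is treated as false, any other integer as true."""
    if condition:          # bool(0) is False; bool(n) is True for any n != 0
        return "IF branch"
    return "ELSE branch"
```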

Informatica Data Director for Data Quality

Informatica Data Director for Data Quality is a web-based application that you use to review the bad record and duplicate record output from an Exception transformation. You can edit bad records, and you can consolidate duplicate records into a single master record. You use Informatica Data Director for Data Quality to complete a Human task in a workflow. When you log on to the application, Informatica Data Director for Data Quality connects to the database tables specified in the workflow and displays the tasks to perform.

Mapping and Mapplet Editors

The Developer tool provides the following options in mapping and mapplet editors:
• Shift + resize an object: After you resize the object, the Developer tool arranges all objects so that no objects overlap.
• Align All to Grid: The Developer tool aligns all objects in the editor based on data flow.
• Restore All: When an editor contains iconized objects, the Developer tool restores the objects to their original sizes without overlapping them.

Project Permissions

You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.

Probabilistic Models

A probabilistic model is a content set that you can use to identify data values on input ports that contain one or more values in a delimited string. A probabilistic model uses probabilistic matching logic to identify data values by the types of information the values contain. You can use a probabilistic model in Labeler and Parser transformations.
You create a probabilistic model in the Developer tool. You select the model from a project folder in the Model repository. The Developer tool writes probabilistic model data to a file you specify in the Content Management Service.

Scorecards

System Mapping Parameters

System mapping parameters are constant values that define the directories where the Data Integration Service stores cache files, reject files, source files, target files, and temporary files. You define the values of the system parameters on a Data Integration Service process in the Administrator tool. By default, the system parameters are assigned to the flat file directory, cache file directory, and temporary file directory fields.
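Conceptually, a system parameter is a name that resolves to a directory configured on the service process. A minimal resolution sketch in Python; the parameter names and directories are illustrative placeholders, not the exact Informatica identifiers:

```python
# Hypothetical mapping from system parameter names to directories
# configured on a Data Integration Service process.
SYSTEM_PARAMS = {
    "CacheDir": "/opt/infa/dis/cache",
    "SourceDir": "/opt/infa/dis/source",
    "TargetDir": "/opt/infa/dis/target",
}

def resolve(path_template, params=SYSTEM_PARAMS):
    """Expand $Name references in a path against the parameter table."""
    for name, value in params.items():
        path_template = path_template.replace("$" + name, value)
    return path_template
```

Because the values live on the service process, the same mapping can run on different environments without editing the paths it references.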

Workflows

A workflow is a graphical representation of a set of events, tasks, and decisions that define a business process. You use the Developer tool to add objects to a workflow and to connect the objects with sequence flows. The Workflow Service Module is the component in the Data Integration Service that uses the instructions configured in the workflow to run the objects.
A workflow can contain the following objects:
A sequence flow connects workflow objects to specify the order in which the Data Integration Service runs the objects. You can create a conditional sequence flow to determine whether the Data Integration Service runs the next object.
You can define and use workflow variables and parameters to make workflows more flexible. A workflow variable represents a value that records run-time information and that can change during a workflow run. A workflow parameter represents a constant value that you define in a parameter file before running a workflow.
After you validate a workflow to identify errors, you add the workflow to an application and deploy the application to a Data Integration Service. You run an instance of the workflow from the deployed application using the infacmd wfs command line program. You monitor the workflow instance run in the Monitoring tool.
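The control flow described above, ordered objects joined by sequence flows where some flows are conditional, can be sketched as a tiny evaluator in Python. The object names, flow table, and condition form are illustrative assumptions, not the Developer tool's actual model:

```python
def run_workflow(start, flows, context):
    """Follow sequence flows from the start object, stopping when a
    conditional flow evaluates to False or no outgoing flow remains."""
    ran = [start]
    current = start
    while current in flows:
        target, condition = flows[current]
        if condition is not None and not condition(context):
            break  # conditional sequence flow: do not run the next object
        ran.append(target)
        current = target
    return ran

# Illustrative workflow: Start -> Mapping task -> Notification task,
# where the final flow is conditional on the number of rows processed.
FLOWS = {
    "Start": ("Mapping_Task", None),
    "Mapping_Task": ("Notification_Task", lambda ctx: ctx["rows"] > 0),
}
```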

Informatica Data Services

This section describes new features and enhancements to Informatica Data Services.

Business Intelligence Tools

You can query published data services with the OBIEE 11.1.1.5 or 11.1.1.3, Toad for Data Analysts, and Microsoft SQL Server Reporting Services business intelligence tools.

Connections

Version 9.5.0 includes the following enhancements for connections:

Connectivity to SAP HANA

You can connect to an SAP HANA database using ODBC.

Data Processor Transformation

You can configure a Data Transformation service in the Developer tool by configuring it in a Data Processor transformation. Create a script in the IntelliScript editor or configure an XMap to map input XML to output XML in the transformation. Add a Data Processor transformation to a mapping or export the transformation as a service to a Data Transformation repository.

Find in Editor

In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.

Import from PowerCenter

You can import objects from a PowerCenter repository to a Model repository. You can connect to a PowerCenter repository from the Developer tool and select objects to import into a target location in the Model repository. The import process validates and converts the PowerCenter objects to Model repository objects based on compatibility. You can check the feasibility of the import before the final import. The Developer tool creates a final summary report with the results of the import.

Mapping and Mapplet Editors

The Developer tool provides the following options in mapping and mapplet editors:
• Shift + resize an object: After you resize the object, the Developer tool arranges all objects so that no objects overlap.
• Align All to Grid: The Developer tool aligns all objects in the editor based on data flow.
• Restore All: When an editor contains iconized objects, the Developer tool restores the objects to their original sizes without overlapping them.

Mapping Specifications

Version 9.5.0 includes the following enhancements for mapping specifications in the Analyst tool:

Performance

Version 9.5.0 includes the following performance enhancements:

Project Permissions

You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.

Row Level Security

Administrators can assign security predicates on virtual tables to restrict access to rows of data when users query the tables.
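The effect of a security predicate is that the service appends a user-specific filter to queries against the virtual table. A simplified sketch in Python; the predicate storage, format, and function name are illustrative assumptions:

```python
def apply_row_security(query, table, predicates, user):
    """Append the user's security predicate for the table, if any,
    so the query only returns rows the user is permitted to see."""
    predicate = predicates.get((table, user))
    if predicate is None:
        return query
    joiner = " AND " if " where " in query.lower() else " WHERE "
    return query + joiner + predicate

# Hypothetical predicate assignment: analyst1 may only see EMEA orders.
PREDICATES = {("orders", "analyst1"): "region = 'EMEA'"}
```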

Scorecards

SQL Data Services

Version 9.5.0 includes the following enhancements for SQL data services:

System Mapping Parameters

System mapping parameters are constant values that define the directories where the Data Integration Service stores cache files, reject files, source files, target files, and temporary files. You define the values of the system parameters on a Data Integration Service process in the Administrator tool. By default, the system parameters are assigned to the flat file directory, cache file directory, and temporary file directory fields.

Web Services

Web Service Consumer Transformation
Version 9.5.0 includes the following enhancements for the Web Service Consumer transformation:
Generic Fault
You can define a generic fault to return an error message to a web service client when an error is not defined by a fault element in the WSDL. Create a Fault transformation to return a generic error message.
Schema Objects
Version 9.5.0 includes the following enhancements for schema objects:
Hierarchy Level of Elements
You can change the hierarchy of the elements in an operation mapping.
Operations
You can create and configure operations in the web service Overview view. After you manually create a web service, you can create an operation from a reusable object.
SOAP 1.2
The Data Integration Service can process SOAP 1.2 messages with document/literal encoding. Each web service can have an operation that uses a SOAP 1.2 binding. When you create a fault using SOAP 1.2, the wizard creates the code, reason, node, and role elements.
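A SOAP 1.2 fault carries the Code, Reason, Node, and Role elements that the wizard generates. A minimal example of building such a fault with Python's standard library; the fault values and function name are illustrative:

```python
import xml.etree.ElementTree as ET

# SOAP 1.2 envelope namespace.
SOAP12 = "http://www.w3.org/2003/05/soap-envelope"

def build_fault(code, reason, node, role):
    """Build a SOAP 1.2 Fault element containing the Code, Reason,
    Node, and Role child elements."""
    fault = ET.Element(f"{{{SOAP12}}}Fault")
    code_el = ET.SubElement(fault, f"{{{SOAP12}}}Code")
    ET.SubElement(code_el, f"{{{SOAP12}}}Value").text = code
    reason_el = ET.SubElement(fault, f"{{{SOAP12}}}Reason")
    ET.SubElement(reason_el, f"{{{SOAP12}}}Text").text = reason
    ET.SubElement(fault, f"{{{SOAP12}}}Node").text = node
    ET.SubElement(fault, f"{{{SOAP12}}}Role").text = role
    return fault

fault = build_fault("env:Receiver", "Order id not found",
                    "http://example.com/dis", "ultimateReceiver")
xml_text = ET.tostring(fault, encoding="unicode")
```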
WSDL Synchronization
You can synchronize a WSDL data object when the WSDL files change. When you synchronize a WSDL data object, the Developer tool re-imports the object metadata from the WSDL files. The Developer tool also updates objects that reference the WSDL or marks them as changed when you open them.

Informatica Data Transformation

Effective in version 9.5.0, Data Transformation moved to the Informatica platform. You can now create and test a Data Transformation service in the Developer tool. Create a Data Processor transformation that includes script objects or XMap objects to transform data. Create a script in the Data Processor transformation Script editor. A script can contain Parser, Serializer, Mapper, Transformer, and Streamer components. Define an XMap in the transformation XMap editor to map input XML to output XML. You can add a Data Processor transformation to a mapping or export the transformation as a service to a Data Transformation repository.
You can import a Data Transformation project into a Data Processor transformation to upgrade a script from Data Transformation version 9.1.0. You can also deploy a Data Transformation project as a service, and then import the service to a Data Processor transformation.

Informatica Development Platform

This section describes new features and enhancements to Informatica Development Platform.

Design API

Version 9.5.0 includes the following enhancements for the Design API:

Informatica Domain

This section describes new features and enhancements to the Informatica domain.

Connection Management

You can rename connections.

Data Director Service

The Informatica Data Director Service is an application service that runs Informatica Data Director for Data Quality in the Informatica domain. Create and enable an Informatica Data Director Service on the Domain tab of Informatica Administrator.
When you enable the Informatica Data Director Service, the Service Manager starts Informatica Data Director for Data Quality. You can open Informatica Data Director for Data Quality in a web browser.

Data Integration Service

Directories for Data Integration Service Files
You can configure the Data Integration Service process properties that define where the service stores files.
The Data Integration Service process has the following directory properties:
• Home Directory: Root directory accessible by the node. This is the root directory for the other service process variables. Default is <Informatica Services Installation Directory>/tomcat/bin.
• Log Directory: Directory for log files. Default is <home directory>/disLogs.
• Cache Directory: Directory for index and data cache files for transformations. Default is <home directory>/Cache.
• Source Directory: Directory for source flat files used in a mapping. Default is <home directory>/source.
• Target Directory: Default directory for target flat files used in a mapping. Default is <home directory>/target.
• Rejected Files Directory: Directory for reject files. Reject files contain rows that were rejected when running a mapping. Default is <home directory>/reject.
Out of Process Execution
You can run each Data Integration Service job as a separate operating system process. Each job can run separately without affecting other jobs running on the Data Integration Service. For optimal performance, run batch jobs and long-running jobs, such as preview, profile, scorecard, and mapping jobs, out of process.
Email Server Properties
You can configure email server properties for the Data Integration Service. The email server properties configure the SMTP server that the Data Integration Service uses to send email notifications from a workflow.
Grid
You can run the Data Integration Service on a grid. When you run an object on a grid, you improve scalability and performance by distributing the work across multiple DTM processes running on nodes in the grid.
Human Task Service Module
The Human Task Service Module is the component in the Data Integration Service that manages requests to run a Human task in a workflow.
Logical Data Object Properties
If you want to manage the data object cache with an external tool, specify a cache table name for each logical data object. When you specify a cache table name, the external tool that you configure populates, purges, and refreshes the cache.
Logical Data Object Column Property
You can configure the Data Integration Service to generate indexes for the cache table based on columns in a logical data object. The indexes can increase the performance of queries on the cache database.
Optimizer Level
You can configure the optimizer level in Data Integration Service application properties for an SQL data service or a web service. The optimizer level determines which optimization methods the Data Integration Service applies to the SQL data service query or to the web service request at run time.
SQL Properties
You can configure the SQL properties for the Data Integration Service.
The Data Integration Service has the following SQL properties:
• DTM Keep Alive Time: Number of milliseconds that the DTM process stays open after it completes the last request. Identical SQL queries can reuse the open process. You can set this property globally or for each SQL data service that is deployed to the Data Integration Service.
• Table Storage Connection: Relational database connection that stores temporary tables for SQL data services.
• Skip Log Files: Prevents the Data Integration Service from generating log files when the SQL data service request completes successfully and the tracing level is set to INFO or higher.
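DTM Keep Alive Time effectively lets a DTM process linger so that later identical requests can reuse it. The idea can be sketched as a keepalive timer in Python; the class name and timings are illustrative, not the DTM implementation:

```python
import time

class KeepAliveProcess:
    """Sketch of keepalive reuse: a 'process' stays available for reuse
    until keep_alive_ms elapses after its last request."""
    def __init__(self, keep_alive_ms):
        self.keep_alive_ms = keep_alive_ms
        self.last_used = time.monotonic()

    def is_alive(self):
        return (time.monotonic() - self.last_used) * 1000 < self.keep_alive_ms

    def handle(self, request):
        self.last_used = time.monotonic()  # each request resets the timer
        return f"handled {request}"
```

A larger keepalive value trades memory held by idle processes for lower startup cost on repeated requests.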
Virtual Table Properties
If you want to manage the data object cache with an external tool, specify a cache table name for each virtual table. When you specify a cache table name, the external tool that you configure populates, purges, and refreshes the cache.
Virtual Table Column Property
You can configure the Data Integration Service to generate indexes for the cache table based on columns in a virtual table. The indexes can increase the performance of queries on the cache database.
Web Service Properties
You can configure the web service properties for the Data Integration Service.
The Data Integration Service has the following web service properties:
• DTM Keep Alive Time: Number of milliseconds that the DTM process stays open after it completes the last request. Web service requests that are issued against the same operation can reuse the open process. You can set this property globally or for each web service that is deployed to the Data Integration Service.
• Logical URL: Prefix for the WSDL URL if you use an external HTTP load balancer.
• Skip Log Files: Prevents the Data Integration Service from generating log files when the web service request completes successfully and the tracing level is set to INFO or higher.
Workflow Service Module
The Workflow Service Module is the component in the Data Integration Service that manages requests to run workflows.

Monitoring

You can monitor a workflow instance run in the Monitoring tab of the Administrator tool. You can view the status of running workflow and workflow object instances. You can abort or cancel a running workflow instance. You can also view workflow reports, workflow logs, and mapping logs for mappings run by Mapping tasks in the workflow.

PowerExchange Listener Service

You can configure PowerExchange so that Data Integration Service workflows connect to a PowerExchange Listener through a PowerExchange Listener Service.
If the NODE statement in the DBMOVER configuration file on a Data Integration Service node includes the service_name parameter, the Data Integration Service ignores the host_name parameter on the NODE statement and uses the service_name and port parameters to connect to the Listener Service that manages the PowerExchange Listener process.
The function of the NODE statement did not change for PowerCenter Integration Service workflows.

Profile Privilege

Assign the Manage Data Domains privilege for the Model Repository Service to enable a user to create, edit, and delete data domains in the data domain glossary.

Security

Command Line Programs

This section describes new commands and options for the Informatica command line programs.

infacmd cms Commands

Version 9.5.0 includes the following new infacmd cms commands:
• CreateAuditTables: Creates audit trail tables that record any change made to probabilistic model content sets.
• DeleteAuditTables: Deletes audit trail tables that record any change made to probabilistic model content sets.
• ResyncData: Synchronizes the probabilistic model content set files on a Content Management Service machine with the files on the master Content Management Service machine.
• Upgrade: Updates a Content Management Service to version 9.5.0. When you run infacmd cms upgrade, the command updates the following properties on the service: Master CMS, Model Repository Service, and Reference Data Location.

The CreateService command is updated with the following new options:
• -RepositoryService (-rs): Specifies a Model Repository Service to associate with the Content Management Service.
• -ReferenceDataLocation (-rdl): Connection name of the database that stores data values for the reference tables defined in the Model repository.
Note: -RepositoryService and -ReferenceDataLocation are required options. Update scripts that use the CreateService command before you run them in an Informatica 9.5.0 environment.

infacmd dis Commands

Version 9.5.0 includes the following updated infacmd dis commands:

CreateService
Contains the following new options:
  • -GridName (-gn): The name of the grid on which the Data Integration Service runs.
  • -HttpsPort: Unique HTTPS port number used for each Data Integration Service process.
  • -KeystoreFile (-kf): Path and file name of the keystore file that contains the keys and certificates required if you use the HTTPS protocol for the Data Integration Service.
  • -KeystorePassword (-kp): Password for the keystore file.

PurgeDataObjectCache
Deletes all cache for a logical data object, including the latest cache run if the latest cache run has exceeded the cache refresh period. Previously, this command deleted all cache for a logical data object except the latest cache run.
Contains the new option -PurgeAll (-pa), which deletes all cache for a logical data object.

UpdateDataObjectOptions
Contains the new data object option DataObjectOptions.RefreshDisabled. This option specifies the name of the table that the Data Integration Service uses to cache the logical data object.

UpdateServiceOptions
Contains the following new options:
  • -NodeName (-nn): The name of the node on which the Data Integration Service runs.
  • -GridName (-gn): The name of the grid on which the Data Integration Service runs.
Contains the following changed option:
  • -Option (-o): This argument is optional. Previously, this argument was required.
Contains the following new Data Integration Service options:
  • HTTPConfigurationOptions.HTTPProtocolType: Security protocol that the Data Integration Service uses: HTTP, HTTPS, or Both.
  • WSServiceOptions.DTMKeepAliveTime: Sets the keepalive time for all web services that are deployed to the Data Integration Service.
Contains the following changed Data Integration Service options:
  • WSServiceOptions.<option name>: Specifies the web service options. Previously, the web service options were named "WebServiceOptions.<option name>".
  • WebServiceOptions.RequestResourceBufferSize: This option is removed.

If you created scripts that use the changed Data Integration Service options, you must update the scripts.

infacmd idd Commands

infacmd idd commands manage the Data Director Service. The Data Director Service runs the Informatica Data Director for Data Quality web application.
Version 9.5.0 includes the following new infacmd idd commands:
• CreateService: Creates an Informatica Data Director Service in the domain.
• ListServiceOptions: Lists the options for a Data Director Service.
• ListServiceProcessOptions: Lists the options for a Data Director Service process.
• RemoveService: Removes the Data Director Service from the domain.
• UpdateServiceOptions: Updates service options for the Data Director Service.
• UpdateServiceProcessOptions: Updates service process options for the Data Director Service.

infacmd ipc Commands

Version 9.5.0 includes the following new infacmd ipc command:
• ImportFromPC: Converts a PowerCenter repository object XML file to a Model repository object XML file.

Version 9.5.0 includes the following updated infacmd ipc command:
• CreateConnection: Contains the new option -ConnectionId (-cid). This option specifies the string that the Data Integration Service uses to identify the connection.

infacmd isp Commands

Version 9.5.0 includes the following new infacmd isp commands:
• RenameConnection: Renames a connection.
• ValidateFeature: Validates that the feature in the specified plug-in file is registered in the domain.

Version 9.5.0 includes the following updated infacmd isp command:
• ImportDomainObjects: The merge conflict resolution strategy for option -ConflictResolution (-cr) is removed. You can still specify the merge strategy for groups in the import control file. If you created scripts that use the merge conflict resolution strategy, you must update the scripts.

infacmd oie Commands

Version 9.5.0 includes the following updated infacmd oie commands:
• Export and Import: Contain the new option -OtherOptions (-oo). This option specifies the options you can set when you import or export data files. You can set an option for a probabilistic model file in the rtm group. The possible values are "full" or "trainedOnly".
The following option string selects trained probabilistic model files:
rtm:disName=ds,codePage=UTF-8,refDataFile=/folder1/data.zip,pm=trainedOnly
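The -OtherOptions value above is a group name followed by comma-separated key=value pairs. A small parser sketch in Python; the function name is illustrative:

```python
def parse_other_options(option_string):
    """Split an option string such as "rtm:key=value,key=value"
    into its group name and a dict of key/value pairs."""
    group, _, pairs = option_string.partition(":")
    options = dict(pair.split("=", 1) for pair in pairs.split(","))
    return group, options

group, opts = parse_other_options(
    "rtm:disName=ds,codePage=UTF-8,refDataFile=/folder1/data.zip,pm=trainedOnly")
```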

infacmd ps Commands

Version 9.5.0 includes the following new infacmd ps commands:
• cancelProfileExecution: Cancels the profile model run.
• executeProfile: Runs the profile model.
• getProfileExecutionStatus: Gets the run-time status of a profile model.
• migrateScorecards: Migrates scorecard results from Informatica 9.1.0 to 9.5.0.

infacmd rtm Commands

Version 9.5.0 includes the following updated infacmd rtm commands:

DeployImport
Contains the following changed options:
  • -ConflictResolution (-cr): This option is removed.
  • -DataIntegrationService (-ds): Identifies the Data Integration Service. Previously, you used the -DsServiceName (-dsn) option.
  • -Folder (-f): Identifies a folder on the machine that runs the command. Previously, this option identified a folder on a Data Integration Service machine.
  • -StagingDbName (-sdb): This option is removed.

Export
Contains the changed option -Folder (-f). This option identifies a folder on the machine that runs the command. Previously, this option identified a folder on a Data Integration Service machine.

Import
Contains the following changed options:
  • -Folder (-f): Identifies a folder on the machine that runs the command. Previously, this option identified a folder on a Data Integration Service machine.
  • -ImportType (-it): Specifies the type of content to import. The DataOnly argument is deprecated for this option. Use the MetadataAndData argument with the -ImportType option to import reference data into the Model repository and the reference data database. Use the infacmd oie ImportObjects command to import data to the reference data database only.

If you created scripts that use the changed options, you must update the scripts.

infacmd sql Commands

Version 9.5.0 includes the following updated infacmd sql commands:

UpdateSQLDataServiceOptions
Contains the following new options:
  • SQLDataServiceOptions.DTMKeepAliveTime: Sets the keepalive time for one SQL data service that is deployed to the Data Integration Service.
  • SQLDataServiceOptions.optimizeLevel: Sets which optimization methods the Data Integration Service applies to SQL data service queries.

UpdateTableOptions
Contains the new data object option VirtualTableOptions.RefreshDisabled. This option specifies the name of the table that the Data Integration Service uses to cache the virtual table.

infacmd wfs Commands

Version 9.5.0 includes the following new infacmd wfs commands:
• ListWorkflowParams: Lists the parameters for a workflow and creates a parameter file that you can use when you run a workflow.
• StartWorkflow: Starts an instance of a workflow.

infacmd ws Commands

Version 9.5.0 includes the following updated infacmd ws command:

UpdateWebServiceOptions
Contains the following new options:
  • WebServiceOptions.DTMKeepAliveTime: Sets the keepalive time for one web service that is deployed to the Data Integration Service.
  • WebServiceOptions.optimizeLevel: Sets which optimization methods the Data Integration Service applies to web service requests.

pmrep

The following pmrep commands contain the new option -y. This option displays the database type of sources and targets:
• ExecuteQuery
• FindCheckout
• ListObjects
• ListObjectDependencies
• Validate

PowerCenter

This section describes new features and enhancements to PowerCenter.

Datatypes

PowerCenter supports the Microsoft SQL Server datetime2 datatype. The datetime2 datatype has a precision of 27 and a scale of 7.

Transformation Language

Use the optional argument, match_from_start, with the REG_EXTRACT function to return the substring if a match is found from the start of the string.
The REG_EXTRACT function uses the following syntax:
REG_EXTRACT( subject, 'pattern', subPatternNum, match_from_start )
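The match_from_start behavior is analogous to anchoring a regular expression at the start of the input. The following Python sketch illustrates the distinction using the standard re module (REG_EXTRACT itself runs in the PowerCenter transformation language, not Python, so this is an analogy rather than the actual function):

```python
import re

def reg_extract(subject, pattern, sub_pattern_num=1, match_from_start=False):
    """Approximate REG_EXTRACT: return the numbered subpattern of the
    first match, or None if there is no match. With match_from_start,
    the match must begin at the start of the subject
    (re.match versus re.search)."""
    matcher = re.match if match_from_start else re.search
    m = matcher(pattern, subject)
    return m.group(sub_pattern_num) if m else None
```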

Connectivity to SAP HANA

You can connect to an SAP HANA database using ODBC.

Metadata Manager

This section describes new features and enhancements to Metadata Manager.

Resources

SAP BW Resource
You can create and configure an SAP BW resource to extract metadata from SAP NetWeaver Business Warehouse.
Custom Resource
You can create and configure custom resources to extract metadata from custom files such as comma-separated files. You can create load template files that contain all of the mapping rules and rule sets used to load the custom resources.

Rule-based Links

Use rule-based links to define rules that Metadata Manager uses to link matching elements between a custom resource type and another custom, packaged, or business glossary resource type. You can also configure rule-based links between a business glossary and a packaged resource type. Configure rule-based links so that you can run data lineage analysis across metadata sources.
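A linking rule is essentially a join condition between elements of two resources. A simplified sketch in Python that links elements whose rule expressions produce equal values; the rule form and element structure are illustrative assumptions, not Metadata Manager's rule XML:

```python
def link_elements(source_elems, target_elems, rule):
    """Pair source and target elements whose rule keys match,
    e.g. rule = lambda e: e["name"].lower() for a case-insensitive
    name comparison."""
    targets = {}
    for elem in target_elems:
        targets.setdefault(rule(elem), []).append(elem)
    links = []
    for elem in source_elems:
        for match in targets.get(rule(elem), []):
            links.append((elem["name"], match["name"]))
    return links
```

Data lineage analysis can then traverse the resulting links between the two metadata sources.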

Command Line Programs

The following Metadata Manager commands are new:
• createloadtemplate: Creates a load template file.
• generatedefaultloadtemplate: Generates a default load template to load all top-level classes for the specified model.
• getloadtemplate: Exports a load template file.
• deleteloadtemplate: Deletes a load template file.
• listloadtemplate: Lists all the load template files for a custom resource.
• updateloadtemplate: Updates a load template file.
• createlinkruleset: Creates a linking rule set based on a rule set XML file.
• updatelinkruleset: Updates a linking rule set based on a modified rule set XML file. If the rule set does not exist, the command creates the rule set.
• deletelinkruleset: Deletes a linking rule set.
• exportlinkruleset: Exports all linking rule sets for a resource to XML files. You can import the rule sets into another Metadata Manager repository.
• importlinkruleset: Imports all linking rule sets from XML files in the specified path into the Metadata Manager repository.

PowerExchange Adapters

This section describes new features and enhancements to PowerExchange adapters in version 9.5.0.

Adapters for PowerCenter

• PowerExchange for Greenplum
• PowerExchange for Microsoft Dynamics CRM
• PowerExchange for Salesforce
• PowerExchange for SAP NetWeaver
• PowerExchange for Teradata Parallel Transporter API
• PowerExchange for Ultra Messaging

Adapters for Informatica

• PowerExchange for Facebook
• PowerExchange for LinkedIn
• PowerExchange for SAP NetWeaver
• PowerExchange for Twitter

Documentation

This section describes new features and enhancements to the documentation.

Documentation DVD

The Informatica Documentation DVD contains product manuals in PDF format. Effective in 9.5.0, the documentation DVD uses a browser-based user interface. Supported browsers are Internet Explorer 7.0 or later and Mozilla Firefox 9.0 or later. Ensure that JavaScript is enabled and that the Adobe Acrobat Reader plug-in is installed in your browser.

Data Quality User Guide

The Informatica Data Quality User Guide contains information about profiles, reference data, rules, and scorecards. It includes data quality information from the Informatica Data Explorer User Guide, Informatica Data Quality Analyst User Guide, Informatica Developer User Guide, and Informatica Developer Transformation Guide.

Data Processor Transformation Guide

The Informatica Data Processor Transformation Guide contains information that can help you design scripts and XMaps in the Data Processor transformation in the Developer tool and implement them in Data Integration Services. It consolidates information from the Data Transformation Studio Editing Guide, Data Transformation Studio User Guide, and Data Transformation Engine Developer Guide.

Data Services Performance Tuning Guide

The Informatica Data Services Performance Tuning Guide contains information that can help you identify and eliminate bottlenecks and tune the Administrator, Developer, and Analyst tools to improve data services performance.

Data Services User Guide

The Informatica Data Services User Guide contains information about data services, virtual data, queries, and data services configuration. It consolidates information from the Informatica JDBC/ODBC Connection Guide, Informatica SQL Reference, and the Informatica Developer User Guide.

Developer Workflow Guide

The Informatica Developer Workflow Guide describes how to create and configure workflows in the Developer tool.