Preface
Informatica Resources
Informatica Network
Informatica Knowledge Base
Informatica Documentation
Informatica Product Availability Matrices
Informatica Velocity
Informatica Marketplace
Informatica Global Customer Support
Introduction to Data Engineering Administration
Data Engineering Integration Engines
Hadoop Integration
Hadoop Utilities
High Availability
Run-time Process on the Blaze Engine
Run-time Process on the Spark Engine
Databricks Integration
Run-time Process on the Databricks Spark Engine
Application Services
Data Integration Service Process
Security
Connections
Authentication
Authentication Overview
Support for Authentication Systems on Hadoop
Authentication with Kerberos
User Impersonation
Authentication with Apache Knox Gateway
Configuring Apache Knox for Cloudera CDP Public Cloud
Running Mappings on a Cluster with Kerberos Authentication
Running Mappings with Kerberos Authentication Overview
Running Mappings in a Kerberos-Enabled Hadoop Environment
Step 1. Set Up the Kerberos Configuration File on the Domain Host
Step 2. Set up the Cross-Realm Trust
Step 3. Create Matching Operating System Profile Names
Step 4. Create the Principal Name and Keytab Files in the Active Directory Server
Step 5. Specify the Kerberos Authentication Properties for the Data Integration Service
Step 6. Configure the Execution Options for the Data Integration Service
Step 7. Configure the Developer Tool
User Impersonation with Kerberos Authentication
User Impersonation in the Hadoop Environment
User Impersonation in the Native Environment
Running Mappings in the Native Environment
Configure the Analyst Service
Authorization
Authorization Overview
Support for Authorization Systems on Hadoop
HDFS Permissions
SQL Authorization for Hive
Key Management Servers
Configuring KMS for Informatica User Access
Configuring Access to an SSL/TLS-Enabled Cluster
Configure the Hive Connection for SSL-Enabled Clusters
Import Security Certificates from an SSL-Enabled Cluster
Rules and Guidelines for Importing Security Certificates from an SSL-Enabled Cluster
Import Security Certificates from a TLS-Enabled Domain
Configuring Access to an SSL-Enabled Database
Configure the JDBC Connection for SSL-Enabled Databases
Configuring Sqoop Connectivity to an SSL-Enabled Oracle Database
Cluster Configuration
Cluster Configuration Overview
Cluster Configuration and Connections
Copying a Connection to Another Domain
Cluster Configuration Views
Active Properties View
Overridden Properties View
Create a Hadoop Cluster Configuration
Before You Import
Importing a Hadoop Cluster Configuration from the Cluster
Importing a Hadoop Cluster Configuration from a File
Create a Databricks Cluster Configuration
Importing a Databricks Cluster Configuration from the Cluster
Importing a Databricks Cluster Configuration from a File
Edit the Cluster Configuration
Filtering Cluster Configuration Properties
Overriding Imported Properties
Creating User-Defined Properties
Deleting Cluster Configuration Properties
Refresh the Cluster Configuration
Example - Cluster Configuration Refresh
Delete a Cluster Configuration
Cluster Configuration Privileges and Permissions
Privileges and Roles
Permissions
Cloud Provisioning Configuration
Cloud Provisioning Configuration Overview
Verify Prerequisites
Enable DNS Resolution from an On-Premises Informatica Domain
AWS Cloud Provisioning Configuration Properties
General Properties
Permissions
EC2 Configuration
Azure Cloud Provisioning Configuration Properties
Authentication Details
Storage Account Details
Cluster Deployment Details
External Hive Metastore Details
Databricks Cloud Provisioning Configuration Properties
Create the Cloud Provisioning Configuration
Complete the Azure Cloud Provisioning Configuration
Create a Cluster Connection
Data Integration Service Processing
Overview of Data Integration Service Processing
Data Integration Service Queueing
Execution Pools
Data Engineering Recovery
Scenarios Where Recovery is Possible
Scenarios Where Recovery is Not Possible
Recovery Job Management
Monitoring Recovered Jobs
Data Engineering Recovery Configuration
Tuning for Data Engineering Job Processing
Deployment Types
Tuning the Application Services
Tuning the Hadoop Run-time Engines
Autotune
Connections Reference
Connections Overview
Cloud Provisioning Configuration
AWS Cloud Provisioning Configuration Properties
Azure Cloud Provisioning Configuration Properties
Databricks Cloud Provisioning Configuration Properties
Amazon Redshift Connection Properties
Amazon S3 Connection Properties
Blockchain Connection Properties
Cassandra Connection Properties
Confluent Kafka Connection
General Properties
Confluent Kafka Broker Properties
SSL Properties
Creating a Confluent Kafka Connection with infacmd
Databricks Connection Properties
Google Analytics Connection Properties
Google BigQuery Connection Properties
Google Cloud Spanner Connection Properties
Google Cloud Storage Connection Properties
Google PubSub Connection Properties
Hadoop Connection Properties
Hadoop Cluster Properties
General Properties
Reject Directory Properties
Blaze Configuration
Spark Configuration
HDFS Connection Properties
HBase Connection Properties
HBase Connection Properties for MapR-DB
Hive Connection Properties
JDBC Connection Properties
JDBC Connection String
Sqoop Connection-Level Arguments
Delta Lake JDBC Connection Properties
JDBC V2 Connection Properties
Kafka Connection Properties
General Properties
Kafka Broker Properties
SSL Properties
Creating a Kafka Connection with infacmd
Kudu Connection Properties
Microsoft Azure Blob Storage Connection Properties
Microsoft Azure Cosmos DB SQL API Connection Properties
Microsoft Azure Data Lake Storage Gen1 Connection Properties
Microsoft Azure Data Lake Storage Gen2 Connection Properties
Microsoft Azure SQL Data Warehouse Connection Properties
Snowflake Connection Properties
Creating a Connection to Access Sources or Targets
Creating a Hadoop Connection
Configuring Hadoop Connection Properties
Cluster Environment Variables
Cluster Library Path
Common Advanced Properties
Blaze Engine Advanced Properties
Spark Advanced Properties
Monitoring REST API
Monitoring REST API Overview
Monitoring Metadata Document
Sample Metadata Document
ClusterStats
Retrieve Cluster Statistics
MappingStats
Sample Retrieve Mapping Statistics
MappingAdvancedStats
Sample Retrieve Advanced Mapping Statistics
MappingExecutionSteps
Sample Retrieve Mapping Execution Steps
MappingExecutionPlans
Sample Retrieve Mapping Execution Plans