When you set up a Kafka connection, configure the connection properties.
The following table describes the Kafka connection properties:
Property
Description
Connection Name
Name of the connection.
The name is not case sensitive. It must be unique within the domain. You can change this property after you create the connection. The name cannot exceed 128 characters, contain spaces, or contain the following special characters:
Description
Optional. Description that you use to identify the connection.
The description cannot exceed 4,000 characters.
Type
The Kafka connection type.
If you do not see the connection type, go to the Add-On Connectors page in Administrator to install the connector.
Runtime Environment
Name of the runtime environment where you want to run tasks.
Kafka Broker List
Comma-separated list of the Kafka brokers.
To specify a Kafka broker, use the following format:
<HostName>:<PortNumber>
Note: When you connect to a Kafka broker over SSL, you must specify the fully qualified domain name for the host name. Otherwise, the test connection fails with an SSL handshake error.
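For example, a broker list for a two-node cluster might look like the following, where the host names are illustrative:

```
kafka-broker1.example.com:9092,kafka-broker2.example.com:9092
```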
Retry Timeout
Optional. Number of seconds after which the Secure Agent attempts to reconnect to the Kafka broker to read or write data.
Default is 180 seconds.
This property is not used by Database Ingestion and Replication. You can specify an equivalent Kafka property in Additional Connection Properties.
Kafka Broker Version
Kafka message broker version. Valid values are Apache Kafka 0.10.1.1 and later.
Optional for a streaming ingestion and replication task.
Additional Connection Properties
Optional. Comma-separated list of additional configuration properties of the Kafka producer or consumer.
For a streaming ingestion and replication task, ensure that you set the Kerberos service name property if you configure the Security Protocol as SASL_PLAINTEXT or SASL_SSL.
For a database ingestion and replication task, if you want to specify a security protocol and its properties, specify them here instead of in the Additional Security Properties property. For example: security.protocol=SSL,ssl.truststore.location=/opt/kafka/config/kafka.truststore.jks,ssl.truststore.password=<truststore_password>.
Confluent Schema Registry URL
Location and port of the Confluent schema registry service to access Avro sources and targets in Kafka.
To specify a schema registry URL, use the following format:
https://<HostName or IP>:<PortNumber>
or
http://<HostName or IP>:<PortNumber>
Example for the schema registry URL:
https://kafkarnd.informatica.com:8082
or
http://10.65.146.181:8084
Applies only when you import a Kafka topic in Avro format that uses the Confluent schema registry to store the metadata.
This property is not used by Database Ingestion and Replication. You can specify an equivalent Kafka property in Additional Connection Properties.
SSL Mode
Required. Determines the encryption type to use for the connection.
You can choose one of the following SSL modes:
- Disabled. Establishes an unencrypted connection to the Kafka broker.
- One-way. Establishes an encrypted connection to the Kafka broker using truststore file and truststore password.
- Two-way. Establishes an encrypted connection to the Kafka broker using truststore file, truststore password, keystore file, and keystore password.
This property is not used by Database Ingestion and Replication. You can specify an equivalent Kafka property in Additional Connection Properties.
SSL TrustStore File Path
Required when you use the one-way or two-way SSL mode.
Absolute path and file name of the SSL truststore file that contains the SSL certificate to connect to the Kafka broker.
SSL TrustStore Password
Required when you use the one-way or two-way SSL mode.
Password for the SSL truststore.
SSL KeyStore File Path
Required when you use the two-way SSL mode.
Absolute path and file name of the SSL keystore file that contains private keys and certificates to connect to the Kafka broker.
SSL KeyStore Password
Required when you use the two-way SSL mode.
Password for the SSL keystore.
Additional Security Properties
Optional. Comma-separated list of additional configuration properties to connect to the Kafka broker in a secure way.
If you specify two different values for the same property in Additional Connection Properties and Additional Security Properties, the value in Additional Security Properties overrides the value in Additional Connection Properties.
This property is not used by Database Ingestion and Replication. You can specify a security protocol and properties in Additional Connection Properties.
Schema Registry Security Configuration Properties
When you configure the Schema Registry URL connection property, you can configure the schema registry security configuration properties. These properties apply only to mappings in advanced mode. You can configure one-way SSL, two-way SSL, and basic authentication to connect to the Confluent schema registry in a secure way.
The following table describes the security properties for the Kafka connection when you use the Confluent schema registry:
Property
Description
SSL Mode Schema Registry¹
Required. Determines the encryption type to use for the connection.
You can choose one of the following SSL modes:
- Disabled. Establishes an unencrypted connection to the Confluent schema registry.
- One-way. Establishes an encrypted connection to the Confluent schema registry using truststore file and truststore password.
- Two-way. Establishes an encrypted connection to the Confluent schema registry using truststore file, truststore password, keystore file, and keystore password.
This property is not used by Database Ingestion and Replication. You can specify an equivalent Kafka property in Additional Connection Properties.
SSL TrustStore File Path Schema Registry¹
Required when you use the one-way or two-way SSL mode.
Absolute path and file name of the SSL truststore file that contains the SSL certificate to connect to the Confluent schema registry.
SSL TrustStore Password Schema Registry¹
Required when you use the one-way or two-way SSL mode.
Password for the SSL truststore.
SSL KeyStore File Path Schema Registry¹
Required when you use the two-way SSL mode.
Absolute path and file name of the SSL keystore file that contains private keys and certificates to connect to the Confluent schema registry.
SSL KeyStore Password Schema Registry¹
Required when you use the two-way SSL mode.
Password for the SSL keystore.
Additional Security Properties Schema Registry²
Optional. Comma-separated list of additional security properties to connect to the Confluent schema registry in a secure way.
For example, when you configure basic authentication to establish secure communication with the Confluent schema registry, specify the following value:
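A basic authentication value typically uses the Confluent schema registry client properties shown below; the user name and password values are placeholders:

```
basic.auth.credentials.source=USER_INFO,basic.auth.user.info=<username>:<password>
```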
If you specify two different values for the same property in Additional Connection Properties and Additional Security Properties Schema Registry, the value in Additional Security Properties Schema Registry overrides the value in Additional Connection Properties.
This property is not used by Database Ingestion and Replication.
¹ Applies only to mappings in advanced mode.
² Applies to both mappings and mappings in advanced mode.
Configuring the krb5.conf file to read data from or write to a Kerberised Kafka cluster
To read from or write to a Kerberised Kafka cluster, configure the default realm, KDC, and Kafka advanced source or target properties.
You can configure Kerberos authentication for a Kafka client by placing the required Kerberos configuration files on the Secure Agent machine and specifying the required JAAS configuration in the Kafka connection. The JAAS configuration defines the keytab and principal details that the Kafka broker must use to authenticate the Kafka client.
Before you read from or write to a Kerberised Kafka cluster, perform the following tasks:
1. Ensure that you have the krb5.conf file for the Kerberised Kafka cluster.
2. Configure the default realm and KDC. If the default /etc/krb5.conf file is not configured or you want to change the configuration, add the following lines to the /etc/krb5.conf file:
[realms]
<REALM NAME> = {
    kdc = <Location where KDC is installed>
    admin_server = <Location where KDC is installed>
}

[domain_realm]
.<domain name or hostname> = <KERBEROS DOMAIN NAME>
<domain name or hostname> = <KERBEROS DOMAIN NAME>
3. To pass a static JAAS configuration file into the JVM using the java.security.auth.login.config property at runtime, perform the following tasks:
a. Create a JAAS configuration file that defines the keytab and principal details.
For example, the JAAS configuration file can contain the following lines of configuration:
// Kafka client authentication. Used for the client-to-Kafka-broker connection.
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    doNotPrompt=true
    useKeyTab=true
    storeKey=true
    keyTab="<path to Kafka keytab file>/<Kafka keytab file name>"
    principal="<principal name>"
    client=true;
};
b. Place the JAAS configuration file and keytab file in the same location on all the Secure Agents.
Informatica recommends that you place the files in a location that is accessible by all the Secure Agents in the runtime environment. For example, /etc or /temp.
c. Configure the following properties:
Kafka connection
Configure the Additional Connection Properties in a Kafka connection and specify the value in the following format:
Configure the Consumer Configuration Properties in the advanced source properties to override the value specified in Additional Connection Properties in a Kafka connection. Specify the value in the following format:
Configure the Producer Configuration Properties in the advanced target properties to override the value specified in Additional Connection Properties in a Kafka connection. Specify the value in the following format:
Configure the Consumer Configuration Properties in the advanced source properties to override the value specified in Kerberos Configuration Properties in a Kafka connection. Specify the value in the following format:
Configure the Producer Configuration Properties in the advanced target properties to override the value specified in Kerberos Configuration Properties in a Kafka connection. Specify the value in the following format:
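As an illustration, the Additional Connection Properties value for a Kerberised cluster might combine the security protocol, the Kerberos service name, and an inline JAAS configuration. The property names follow standard Kafka client conventions; the keytab path and principal are placeholders:

```
security.protocol=SASL_PLAINTEXT,sasl.mechanism=GSSAPI,sasl.kerberos.service.name=kafka,sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true storeKey=true keyTab="<path to Kafka keytab file>" principal="<principal name>";
```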
Configuring SASL PLAIN authentication for a Kafka cluster
In the Kafka connection, you can configure SASL PLAIN security to connect to a Kafka broker. To read data from or write data to a Kafka broker with SASL PLAIN authentication, configure the Kafka connection properties. To override the properties defined in the Kafka connection, you can configure the advanced source or target properties.
You can configure SASL PLAIN authentication so that the Kafka broker can authenticate the Kafka producer and the Kafka consumer. Kafka uses the Java Authentication and Authorization Service (JAAS) for SASL PLAIN authentication. To enable SASL PLAIN authentication, you must specify the SASL mechanism as PLAIN. You must also provide the formatted JAAS configuration that the Kafka broker must use for authentication. The JAAS configuration defines the user name and password that the Kafka broker must use to authenticate the Kafka client.
Configure the following properties:
Kafka connection
Configure the Additional Connection Properties property in the Kafka connection and specify the value in the following format:
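A typical value uses the standard Kafka client SASL properties shown below; the user name and password are placeholders:

```
security.protocol=SASL_SSL,sasl.mechanism=PLAIN,sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<username>" password="<password>";
```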
In the Security Configuration Section, select One-Way as the SSL Mode and specify the SSL TrustStore File Path and SSL TrustStore Password.
Sources
Configure the Consumer Configuration Properties property in the advanced source properties to override the value that you specified in the Additional Connection Properties property in the Kafka connection. Specify the value in the following format:
Targets
Configure the Producer Configuration Properties property in the advanced target properties to override the value that you specified in the Additional Connection Properties property in the Kafka connection. Specify the value in the following format:
Note: Application Ingestion and Replication and Database Ingestion and Replication do not consume entries from Additional Security Properties but only from Additional Connection Properties.
Configuring SASL PLAIN authentication for an Azure Event Hub Kafka broker
In the Kafka connection, you can configure PLAIN security for the Kafka broker to connect to an Azure Event Hub Kafka broker. When you connect to an Azure Event Hub Kafka broker, the password defines the endpoint URL that contains the fully qualified domain name (FQDN) of the Event Hub namespace, the shared access key name, and the shared access key required to connect to an Azure Event Hub Kafka broker. Configure the SSL Mode as One-Way and provide the path to a trusted root certificate on your file system for SSL TrustStore File Path.
To connect to an Azure Event Hub Kafka broker, configure any of the properties described above and specify the value in the following format:
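For example, an Azure Event Hubs value typically uses the literal user name $ConnectionString and the namespace connection string as the password; the namespace and key values are placeholders:

```
security.protocol=SASL_SSL,sasl.mechanism=PLAIN,sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key name>;SharedAccessKey=<key>";
```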
Configuring SASL_SSL authentication for a Cloud Confluent Kafka cluster
In the Kafka connection, you can configure SSL security for encryption and authentication while connecting to a Kafka broker. To read data from or write data to a Confluent Kafka broker with SASL_SSL authentication, configure the Kafka connection properties. To override the properties defined in the Kafka connection, you can configure the advanced source or target properties.
Configure the following properties:
SSL TrustStore File Path
Absolute path and file name of the SSL truststore file. For example: /root/staging/infaagent/jdk/jre/lib/security/cacerts
SSL TrustStore Password
Password for the SSL truststore.
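For example, Confluent Cloud clusters commonly authenticate with a cluster API key and secret over SASL_SSL. An illustrative Additional Connection Properties value follows standard Kafka client conventions; the key and secret values are placeholders:

```
security.protocol=SASL_SSL,sasl.mechanism=PLAIN,sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<cluster API key>" password="<cluster API secret>";
```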
Connecting to Amazon Managed Streaming for Apache Kafka
In the Kafka connection, you can configure PLAINTEXT or TLS encryption to connect to an Amazon Managed Streaming for Apache Kafka broker. To read data from or write data to an Amazon Managed Streaming for Apache Kafka broker, configure the Kafka connection properties.
Configure the Kafka Broker List property in the Kafka connection and specify the comma-separated list of Kafka brokers that you want to connect to in the following format:
<HostName>:<PortNumber>
Configure TLS encryption to securely connect the Kafka broker to the Kafka producer and the Kafka consumer. To configure TLS encryption for an Amazon Managed Streaming for Apache Kafka broker, configure the following properties:
Property
Values
Additional Connection Properties
security.protocol=SSL
SSL Mode
One-way or Two-way.
SSL TrustStore File Path
Required when you use the one-way or two-way SSL mode.
Absolute path and file name of the SSL truststore file.
SSL TrustStore Password
Required when you use the one-way or two-way SSL mode.
Password for the SSL truststore.
SSL KeyStore File Path
Required when you use the two-way SSL mode.
Absolute path and file name of the SSL keystore file that contains private keys and certificates that the Kafka broker validates against the Kafka cluster certificate.
SSL KeyStore Password
Required when you use the two-way SSL mode.
Password for the SSL keystore.
When you run a mapping that runs on an advanced cluster and connect to an Amazon Managed Streaming for Apache Kafka broker, configure the Kafka broker using SASL_SSL authentication with Salted Challenge Response Authentication Mechanism (SCRAM). To read data from or write data to an Amazon Managed Streaming for Apache Kafka broker with SASL_SSL authentication, configure the following properties:
SSL TrustStore File Path
Required when you use the one-way or two-way SSL mode.
Absolute path and file name of the SSL truststore file.
SSL TrustStore Password
Required when you use the one-way or two-way SSL mode.
Password for the SSL truststore.
SSL KeyStore File Path
Required when you use the two-way SSL mode.
Absolute path and file name of the SSL keystore file that contains private keys and certificates that the Kafka broker validates against the Kafka cluster certificate.