SAP HANA Database Ingestion connection properties

When you set up an SAP HANA connection for a database ingestion and replication task, you must configure connection properties.
The following table describes the SAP HANA connection properties:
Connection property
Description
Connection Name
A name for the connection. This name must be unique within the organization. Connection names can contain alphanumeric characters, spaces, and the following special characters: _ . + -
Spaces at the beginning or end of the name are trimmed and are not saved as part of the name. Maximum length is 100 characters. Connection names are not case sensitive.
Description
An optional description for the connection. Maximum length is 255 characters.
Type
Select SAP HANA Database Ingestion as the connection type.
Runtime Environment
The name of the runtime environment where you want to run database ingestion and replication tasks. You define runtime environments in Administrator.
User Name
The user name for connecting to the SAP HANA instance. Enter the user name in the same case as the database user name specified in SAP HANA.
Password
The password for connecting to the SAP HANA instance.
Host
The name of the machine that hosts the SAP HANA database server.
Port
The port number for the SAP HANA server to which you want to connect. Default is 30015.
Database Name
The SAP HANA source database name.
Advanced Connection Properties
Optional advanced properties for the SAP HANA JDBC driver, which is used to connect to the SAP HANA source. If you specify more than one property=value entry, separate them with an ampersand (&). The JDBC connection properties that you can enter in this field are described in the SAP JDBC Connection Properties documentation. For example: encrypt=true.
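For example, the following value sets two standard SAP HANA JDBC driver properties, encrypt and validateCertificate, and separates the entries with an ampersand:
encrypt=true&validateCertificate=false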
Capture Type
Select one of the following options to indicate the capture method that database ingestion incremental load jobs use to capture change data from SAP HANA databases:
  • Trigger Based. Capture change data from SAP HANA source tables in the schema by using AFTER DELETE, AFTER INSERT, and AFTER UPDATE triggers. The triggers get before images and after images of DML changes for each source table and write entries for the changes to the PKLOG and shadow _CDC tables. This method is the original capture method.
  • Log Based (Preview). Capture change data from the SAP HANA database logs. This method is available only in Preview mode. Preview functionality is supported for evaluation purposes but is unwarranted and is not supported in production environments or any environment that you plan to push to production. For more information, contact Informatica Global Customer Support.
Log Clear
Required for incremental loads. Enter the time interval, in days, after which the PKLOG table entries and shadow _CDC table entries are purged. The purging occurs only while an incremental load job is running.
Valid values for a database ingestion job are 0 to 366. Any positive value in this range causes automatic housekeeping to run while the incremental load job is running. Default is 14.
A value of 0 means that the table entries are not purged. For manual housekeeping, enter 0 and use your in-house process.
Any value outside the range of 0 to 366, including a negative number or non-numeric value, causes database ingestion jobs that use the connection to fail with the following error:
LogClear contains a non numeric number. Caused by: LogClear contains a non numeric number.
Trigger Prefix
If you use the Trigger Based capture type, you can add a prefix to the names of the AFTER DELETE, AFTER INSERT, and AFTER UPDATE triggers that the CDC script generates for each source table to get before images and after images of the DML changes. Enter any prefix value up to 16 characters in length. An underscore (_) follows the prefix in the trigger name, for example, TX_SAP_DEMO_TABLE_DBMI_USER_t_d. You can use the prefix to comply with your site's trigger naming conventions.
Cache Type
If you selected the Log Based (Preview) capture type, select Hana or Oracle as the cache type.
Cache Host
If you selected the Log Based (Preview) capture type, enter the host name of the machine that hosts the cache database.
Cache Port
If you selected the Log Based (Preview) capture type, enter the port number for the cache database server.
Cache User Name
If you selected the Log Based (Preview) capture type, enter the user name to use for connecting to the cache database.
Cache Password
If you selected the Log Based (Preview) capture type, enter the password to use for connecting to the cache database.
Cache Database/Service Name
If you selected the Log Based (Preview) capture type, enter either the Hana cache database name or the Oracle cache service name, depending on the cache type you selected.
Cache Additional Connection Properties
If you selected the Log Based (Preview) capture type, you can enter a list of optional cache connection properties. If you use Hana cache, use the ampersand (&) separator. If you use Oracle cache, use the semicolon (;) separator.
Examples:
Hana: latency=0&communicationtimeout=0
Oracle: EncryptionMethod=SSL;CryptoProtocolVersion=TLSv1.1
Cache Security Connection Properties
If you selected the Log Based (Preview) capture type, you can enter a list of optional security properties for the cache connection. If you use Hana cache, use the ampersand (&) separator. If you use Oracle cache, use the semicolon (;) separator.
Examples:
Hana: encrypt=true&validateCertificate=false
Oracle: KeyStorePassword=xyz;TrustStorePassword=xy
Server Log Path
If you selected the Log Based (Preview) capture type, enter the log path for the SAP HANA DB server.
Client Log Path
If you selected the Log Based (Preview) capture type, enter the mapping of the Secure Agent machine mount path to the source database log location.
Client Archive Log Path
If you selected the Log Based (Preview) capture type, enter the mapping of the Secure Agent machine mount path to the source database archive log location.
Note: If you test the connection and the test fails, check that the SAP HANA JDBC driver file, ngdbc.jar, has been installed in the <Secure Agent installation directory>/ext/connectors/thirdparty/informatica.hanami directory.
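To isolate driver or network problems from the connection configuration, you can also run a minimal JDBC check outside of the product. The following Java sketch is illustrative only and is not part of the SAP HANA Database Ingestion connector. It assumes that ngdbc.jar is on the classpath and uses placeholder values (myhost, 30015, MYDB, MYUSER, mypassword) that stand in for the Host, Port, Database Name, User Name, and Password connection properties described above.

import java.sql.Connection;
import java.sql.DriverManager;

public class HanaConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder values; replace with the values that you enter in the
        // connection properties. The ngdbc.jar driver registers itself with
        // DriverManager, so no explicit driver loading is required.
        String url = "jdbc:sap://myhost:30015/?databaseName=MYDB";
        try (Connection connection = DriverManager.getConnection(url, "MYUSER", "mypassword")) {
            System.out.println("Connected to: " + connection.getMetaData().getDatabaseProductVersion());
        }
    }
}

If this check succeeds but the connection test in the product still fails, focus on the Secure Agent environment, such as the driver file location noted above and the runtime environment selected for the connection.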