
Connector and connectivity issues

Read the following sections to learn about fixed issues, known limitations, and third-party limitations that apply to connectors that you can use in your service.

Databricks Connector

Fixed issues

The following entries describe fixed issues:

Issue: DBMI-20627
Description: Application ingestion and replication jobs and database ingestion and replication jobs that have a Databricks target don't support Unity Catalog volumes for temporarily storing staging files before the data is written to the target, even if the Volume option is selected in the Staging Environment field in the Databricks connection properties.
(April 2025)

Kafka Connector

Fixed issues

The following entries describe fixed issues:

Issue: DBMI-22086
Description: If you use the Kerberos authentication mode for any connection type and do not set the KRB5_CONFIG environment variable, the krb5.conf file might not be located and the connection to Kafka might fail.
(April 2025)
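A minimal sketch of the corresponding setup, assuming the Kerberos configuration file is at /etc/krb5.conf (an assumption; substitute your actual path): export KRB5_CONFIG in the environment of the process that opens the Kafka connection before starting it.

```shell
# Assumption: krb5.conf is at /etc/krb5.conf; adjust the path for your host.
# Setting KRB5_CONFIG lets the Kerberos library locate the configuration file
# instead of relying on a default search path.
export KRB5_CONFIG=/etc/krb5.conf
echo "KRB5_CONFIG=$KRB5_CONFIG"
```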

Microsoft Azure Synapse Analytics Database Ingestion Connector

Known issues

The following entries describe known issues:

Issue: DBMI-19405
Description: Tests of a Microsoft Azure Synapse Analytics Database Ingestion connection for an application ingestion and replication task or database ingestion and replication task fail with the following error if the user name specified in the connection properties includes a special character, such as @, and the target database uses AAD authentication:
Cannot instantiate datasource because of error: Failed to initialize pool: Cannot open server 'noconline.onmicrosoft.com' requested by the login. The login failed. ClientConnectionId: <identifier>
This problem occurs because the Synapse Analytics connector package is missing some .jar files that are required to connect to the target database in this situation.
Workaround: Copy the following .jar files to the connector package:
- activation-1.1
- adal4j-1.4.0
- jcip-annotations-1.0-1
- json-smart-1.3.1
- mail-1.4.7
- nimbus-jose-jwt-9.40
- oauth2-oidc-sdk-5.24.1
The package location is: package-AzureDWGen2MI.xxx\package\dw\thirdparty\informatica.azuredwgen2mi
(August 2024)
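The workaround above can be sketched in shell. The jar names come from the release note; JAR_SOURCE and the sandbox paths are assumptions for illustration, and in practice you would copy the jars into the actual package-AzureDWGen2MI package directory.

```shell
# Illustrative sketch only: stages the copy in a temporary sandbox so the
# script is runnable as-is. JAR_SOURCE is an assumption; point it at the
# directory where you downloaded the jars, and replace PACKAGE_DIR with the
# real connector package path.
SANDBOX=$(mktemp -d)
JAR_SOURCE="$SANDBOX/downloaded-jars"
PACKAGE_DIR="$SANDBOX/package/dw/thirdparty/informatica.azuredwgen2mi"
mkdir -p "$JAR_SOURCE" "$PACKAGE_DIR"

JARS="activation-1.1 adal4j-1.4.0 jcip-annotations-1.0-1 json-smart-1.3.1 mail-1.4.7 nimbus-jose-jwt-9.40 oauth2-oidc-sdk-5.24.1"

# Stand-ins for the real jar files, so the copy step below can be exercised.
for jar in $JARS; do touch "$JAR_SOURCE/$jar.jar"; done

# The workaround itself: copy each required jar into the connector package.
for jar in $JARS; do cp "$JAR_SOURCE/$jar.jar" "$PACKAGE_DIR/"; done

ls "$PACKAGE_DIR"
```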

Microsoft SQL Server Connector

Fixed issues

The following entries describe fixed issues:

Issue: DBMI-21412
Description: Application ingestion and replication jobs and database ingestion and replication jobs that have a SQL Server target might not retry a failed attempt to connect to the SQL Server database.
(February 2025)

Snowflake Data Cloud Connector

Fixed issues

The following entries describe fixed issues:

Issue: DBMI-23417
Description: An application ingestion and replication job or database ingestion and replication job that has a Snowflake target and uses the Superpipe option might fail if a column in a source table is named "TASK_NAME".
(April 2025)
Issue: DBMI-22674
Description: An application ingestion and replication job or database ingestion and replication job that has a Snowflake target and uses the Superpipe option might fail intermittently with the following error:
Channel <LOSS_BASE_LOG> has not committed previous rows to the table, please consider restarting the job to avoid loss of data.
(April 2025)
Issue: DBMI-22255
Description: Application ingestion and replication jobs and database ingestion and replication jobs that perform a combined load of change data to a Snowflake target using the Superpipe option might fail to merge the most recent DML changes to the Snowflake base log table. The most recent changes would not be reflected in the base table until the next merge is executed.
(April 2025)
Issue: DBMI-21861
Description: For application ingestion and replication jobs and database ingestion and replication jobs that use the Superpipe option to stream data to a Snowflake target, the deployment code base does not account for the possibility that multiple tables within the same deployment might have different target states. If one table exists, Data Ingestion and Replication does not check the states of the other tables, and if those tables do not exist, the deployment might not create them correctly.
(February 2025)
Issue: DBMI-21668
Description: For application ingestion and replication jobs and database ingestion and replication jobs that use the Superpipe option to stream data to a Snowflake target, if a row in the source table is deleted and then added again with the same primary key, the row might not be merged correctly into the base table. In this case, the changes might not be replicated to the target.
(February 2025)
Issue: DBMI-20737
Description: Application ingestion and replication jobs and database ingestion and replication jobs that have a Snowflake target and use Authorization Code as the authentication method to connect to Snowflake might fail to regenerate the access token when the token expires.
(February 2025)