Issue | Description |
---|---|
DBMI-20627 | Application ingestion and replication jobs and database ingestion and replication jobs that have a Databricks target do not support using Unity Catalog volumes to temporarily store staging files before the data is written to the target, even if the Volume option is selected in the Staging Environment field of the Databricks connection properties. (April 2025) |
Issue | Description |
---|---|
DBMI-22086 | If you use the Kerberos authentication mode for any connection type and do not set the KRB5_CONFIG environment variable, the krb5.conf file might not be located and the connection to Kafka might fail. (April 2025) |
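For the Kerberos issue above (DBMI-22086), a minimal sketch of the mitigation is to set the KRB5_CONFIG environment variable explicitly before starting the process that connects to Kafka. The path /etc/krb5.conf below is an assumption; substitute the actual location of the krb5.conf file on the host.

```shell
# DBMI-22086 mitigation sketch: point the Kerberos libraries at krb5.conf
# explicitly so the file can be located.
# /etc/krb5.conf is an assumed path; use the file's real location on your host.
export KRB5_CONFIG=/etc/krb5.conf
echo "KRB5_CONFIG is set to $KRB5_CONFIG"
```

Set the variable in the environment of the agent or service that opens the Kafka connection, not just in an interactive shell, so that it is inherited by the connecting process.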
Issue | Description |
---|---|
DBMI-19405 | Tests of a Microsoft Azure Synapse Analytics Database Ingestion connection for an application ingestion and replication task or a database ingestion and replication task fail with the following error if the user name specified in the connection properties includes a special character such as @ and the target database uses AAD authentication: "Cannot instantiate datasource because of error: Failed to initialize pool: Cannot open server 'noconline.onmicrosoft.com' requested by the login. The login failed. ClientConnectionId: <identifier>" This problem occurs because the Synapse Analytics connector package is missing some .jar files that are required to connect to the target database in this situation. Workaround: Copy the missing .jar files to the connector package. The package location is package-AzureDWGen2MI.xxx\package\dw\thirdparty\informatica.azuredwgen2mi (August 2024) |
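The jar-copy workaround above (DBMI-19405) can be sketched as follows. Every path here is an illustrative assumption: temporary directories stand in for the download location and for the real package directory package-AzureDWGen2MI.xxx\package\dw\thirdparty\informatica.azuredwgen2mi, where "xxx" is the installed package version, and example.jar stands in for one of the required .jar files.

```shell
# DBMI-19405 workaround sketch: copy the missing .jar files into the
# Synapse Analytics connector package directory.
# All paths are stand-ins for illustration only.
SRC_DIR="$(mktemp -d)"            # stand-in for the folder holding the downloaded jars
touch "$SRC_DIR/example.jar"      # stand-in for one of the required .jar files
PKG_DIR="$(mktemp -d)/informatica.azuredwgen2mi"   # stand-in for the package location
mkdir -p "$PKG_DIR"
cp "$SRC_DIR"/*.jar "$PKG_DIR/"   # the actual workaround step: copy jars into the package
ls "$PKG_DIR"
```

You may need to restart the Secure Agent after copying the files so that the connector loads them.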
Issue | Description |
---|---|
DBMI-21412 | Application ingestion and replication jobs and database ingestion and replication jobs that have a SQL Server target might not retry a failed attempt to connect to the SQL Server database. (February 2025) |
Issue | Description |
---|---|
DBMI-23417 | An application ingestion and replication job or database ingestion and replication job that has a Snowflake target and uses the Superpipe option might fail if a column in a source table is named "TASK_NAME". (April 2025) |
DBMI-22674 | An application ingestion and replication job or database ingestion and replication job that has a Snowflake target and uses the Superpipe option might fail intermittently with the following error: Channel <LOSS_BASE_LOG> has not committed previous rows to the table, please consider restarting the job to avoid loss of data. (April 2025) |
DBMI-22255 | Application ingestion and replication jobs and database ingestion and replication jobs that perform a combined load of change data to a Snowflake target using the Superpipe option might fail to merge the most recent DML changes from the Snowflake base log table. The most recent changes are not reflected in the base table until the next merge runs. (April 2025) |
DBMI-21861 | For application ingestion and replication jobs and database ingestion and replication jobs that use the Superpipe option to stream data to a Snowflake target, the deployment logic does not account for the possibility that multiple tables within the same deployment might be in different target states. If one table exists, Data Ingestion and Replication does not check the states of the other tables, and the deployment might incorrectly handle the creation of tables that do not yet exist. (February 2025) |
DBMI-21668 | For application ingestion and replication jobs and database ingestion and replication jobs that use the Superpipe option to stream data to a Snowflake target, if a row in the source table is deleted and then added again with the same primary key, the row might not be merged correctly into the base table. In this case, the changes might not be replicated to the target. (February 2025) |
DBMI-20737 | Application ingestion and replication jobs and database ingestion and replication jobs that have a Snowflake target and use Authorization Code as the authentication method to connect to Snowflake might fail to regenerate the access token when the token expires. (February 2025) |