Issue | Description |
---|---|
DBMI-20627 | Application ingestion and replication jobs and database ingestion and replication jobs that have a Databricks target don't support the use of Unity Catalog volumes to temporarily store staging files before the data is written to the target, even if the Volume option is selected in the Staging Environment field in the Databricks connection properties. Workaround: When you configure the Databricks connection properties, select an option other than Volume in the Staging Environment field. (February 2025) |
Issue | Description |
---|---|
DBMI-19760 | An application ingestion and replication or a database ingestion and replication initial load job that has a Google BigQuery target might fail while uploading a staging file in Google Cloud Storage. (October 2024) |
Issue | Description |
---|---|
DBMI-19405 | Tests of a Microsoft Azure Synapse Analytics Database Ingestion connection for an application ingestion and replication task or a database ingestion and replication task fail with the following error if the user name specified in the connection properties includes a special character such as @ and the target database uses AAD authentication: Cannot instantiate datasource because of error: Failed to initialize pool: Cannot open server 'noconline.onmicrosoft.com' requested by the login. The login failed. ClientConnectionId: <identifier> This problem occurs because the Synapse Analytics connector package is missing some .jar files that are required to connect to the target database in this situation. Workaround: Copy the required .jar files to the connector package at the following location: package-AzureDWGen2MI.xxx\package\dw\thirdparty\informatica.azuredwgen2mi (August 2024) |
Issue | Description |
---|---|
DBMI-21412 | Application ingestion and replication jobs and database ingestion and replication jobs that have a SQL Server target might not retry a failed attempt to connect to the SQL Server database. (February 2025) |
DBMI-19742 | Application ingestion and replication jobs and database ingestion and replication jobs that have a SQL Server target write empty strings from the source as nulls on the target. This behavior can cause problems with downstream processing of the target data. The jobs should instead write each source empty string as a space on the target to prevent these problems. (October 2024) |
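The intended behavior described in DBMI-19742 can be approximated with a small pre-load transform. The sketch below is hypothetical and not part of the product: it replaces each empty source string with a single space before rows are written, so the SQL Server target does not store them as NULL.

```python
def normalize_empty_strings(row):
    """Replace empty strings with a single space before loading.

    Hypothetical pre-load transform: SQL Server targets affected by
    DBMI-19742 store source empty strings as NULL, so substituting a
    single space preserves a non-null value on the target. Genuine
    None values from the source are left untouched.
    """
    return [" " if value == "" else value for value in row]


# Example: only true empty strings become spaces; None stays None.
rows = [["abc", "", None], ["", "42", "x"]]
cleaned = [normalize_empty_strings(r) for r in rows]
```

A transform like this would run in whatever staging step sits between extraction and the target write; the exact hook point depends on the pipeline.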
Issue | Description |
---|---|
DBMI-21861 | For application ingestion and replication jobs and database ingestion and replication jobs that use the Superpipe option to stream data to a Snowflake target, the deployment code base does not account for the possibility that multiple tables within the same deployment might have different target states. If one table exists, Data Ingestion and Replication does not check the states of other tables. If other tables do not exist, the deployment might incorrectly handle creating those tables. (February 2025) |
DBMI-21668 | For application ingestion and replication jobs and database ingestion and replication jobs that use the Superpipe option to stream data to a Snowflake target, if a row in the source table is deleted and then added again with the same primary key, the row might not be merged correctly into the base table and then the changes might not be replicated to the target. (February 2025) |
DBMI-20737 | Application ingestion and replication jobs and database ingestion and replication jobs that have a Snowflake target and use the Authorization Code authentication method to connect to Snowflake might fail to regenerate the access token when the token expires. (February 2025) |