Issue | Description |
---|---|
SCAN-17398 | Metadata extraction jobs don't parse JavaScript functions, and the following error occurs: com.compactsolutionsllc.cdimc.common.visitor.tracking.TrackedException: Parse error (NoViableAlt) on token: '.' . Line.column: 9.4, Text: [ >.<Array.from] |
SCAN-7689 | When you run a metadata extraction job on a Snowflake catalog source that includes stored procedure scripts with complex references as values, the job fails with the following error: Failed to analyze statement. Ambiguous suffix in segment map |
MDX-40538 | A Snowflake catalog source job generates the following parse error if a WITH statement in an SQL query includes an expression: Parse error (NoViableAlt) on token: 'NVL' |
MDX-40817 | When you run a Snowflake catalog source job and extract a view created with the LEVEL function, Data Governance and Catalog doesn't display the lineage for this view and shows an Unknown Column error. |
CDGC-73163 | After you upgrade to the November 2024 release, if the first job you run on existing catalog sources is an incremental metadata extraction job, the job fails with the following error: Running Incremental scan is not possible without last successful scan date provided. Workaround: Run a full metadata extraction job first, and then run the incremental metadata extraction job. |
SCAN-13820 | When a stored procedure imports Session from the snowflake.snowpark library, the metadata extraction job doesn't extract statement assets. The following error appears in the scanner debug log file: ERROR (ScannerComponent - PythonImportStatementVisitor:58) {<fqn of the stored procedure>} - Unable to import Python custom module snowflake.snowpark.Session. The module will be treated as black-box: we'll keep the lineage in places when this module is used but we'll not provide and specific behavior of this module. If you have WHL file with this module source code then you can put in the scanners Python user modules directory and run the scanning again. If unsure you can raise an issue for the Scanners team to investigate if there are other options here. Workaround: Don't use the from snowflake.snowpark import Session line in the code. Instead, specify the Snowpark session object as the first argument of the method or function. When you run a stored procedure, Snowflake automatically creates a session object and passes it to the stored procedure. |
SCAN-13841 | If a stored procedure uses the snowflake.snowpark.Session.write_pandas() function to write a pandas DataFrame to a table in the source system, the metadata extraction job doesn't extract statement assets. |
SCAN-13840 | If a stored procedure uses the DataFrame.to_snowpark_pandas() function to convert a Snowpark pandas DataFrame or Series into a Snowpark DataFrame, the metadata extraction job doesn't extract statement assets. The following error appears in the scanner debug log file: ERROR (ScannerComponent - PythonExpressionResolver:58) {<fqn of the procedure>} - Failed to resolve expression modin.pandas.to_snowpark()@21.32:21.63 |
SCAN-13846 | If a stored procedure imports the joblib library and uses its Parallel class, the metadata extraction job doesn't extract statement assets. The following error appears in the scanner debug log file: ERROR (ScannerComponent - PythonImportStatementVisitor:58) {<fqn of the procedure>} - Unable to import Python custom module joblib. The module will be treated as black-box: we'll keep the lineage in places when this module is used but we'll not provide and specific behavior of this module. If you have WHL file with this module source code then you can put in the scanners Python user modules directory and run the scanning again. If unsure you can raise an issue for the Scanners team to investigate if there are other options here. |
MDP-4165 | When you run a writeback job on a Snowflake source system with an OAuth-based connection, the job fails. This issue occurs when the access token for the connection expires after 600 seconds and doesn't get refreshed. Workaround: Generate a new access token for the Snowflake connection in Administrator. |
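The SCAN-13820 workaround above can be sketched as a minimal Python stored procedure handler. This is an illustrative sketch, not code from the product: the handler name, argument names, and table operation are assumptions; the point is only that the session arrives as the first argument rather than through a Session import.

```python
# Sketch of the SCAN-13820 workaround (illustrative names): instead of
# writing "from snowflake.snowpark import Session" inside the procedure
# body, accept the session object that Snowflake creates and passes
# automatically as the first argument of the handler.

def handler(session, table_name: str) -> int:
    # "session" is the Snowpark session instance that Snowflake injects
    # at run time, so no explicit Session import is required and the
    # scanner no longer has to resolve snowflake.snowpark.Session.
    return session.table(table_name).count()
```

Registering a handler written this way as the procedure's entry point avoids the black-box import warning, because the only reference to the session is the argument Snowflake supplies.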