Data profiling tasks don't include all files if you enable partition detection for Amazon S3, Microsoft Azure Data Lake Storage Gen2, and Google Cloud Storage catalog sources with Parquet files.
This issue occurs if a folder contains partitioned Parquet files with incremental schema.
To resolve the issue, disable partition detection and run the data profiling task.
A profile run fails with the following error message for Microsoft Azure Synapse source objects:
Workflow generation failed for object: <DatabaseName/SchemaName/ObjectName> with error: Invalid datatype: int
This issue occurs when the source object includes data types such as int, datetimeoffset, or uniqueidentifier.
To resolve the issue, rerun the failed task. Click Retry Task on the Job Details page to rerun the failed data profiling task.
A profile run fails with the following error message for Amazon S3, Google Cloud Storage, or Azure Data Lake Storage Gen2 source objects:
"Failure w.r.t "java.lang.RuntimeException: No active Metadata_Platform_Service found for "Secure Agent ID" for mentioned combinations in description"
This issue occurs when the connection that you use to create a catalog source includes an inactive Secure Agent.
To resolve the issue, use an active Secure Agent when you create the connection or add an active Secure Agent in the elastic runtime environment field when you configure the profiling task.
A profile run for SAP S/4 HANA source objects fails with the following error message:
"ERROR: "OPTION_NOT_VALID: OPTION_NOT_VALID Message 000 of class SAIS type E"
This issue occurs when you use an SAP Table connection from one of the following versions:
• S/4 HANA version 2021
• SAP ECC version 6.0 EHP8
• SAP NetWeaver system version 7.40 SP26
To resolve this issue, perform the following steps:
1. In the Informatica Intelligent Cloud Services Administrator, open the Runtime Environments page.
2. On the Runtime Environments page, select the Secure Agent associated with the SAP Table connection.
3. Click Edit.
4. In the Custom Configuration Details area, select Data Integration Server as the service and Tomcat as the type.
5. Enter SapStrictSql in the Name field and set the value based on the SAP system language.
6. In Metadata Command Center, run the catalog source job again.
Data profiling tasks don't include columns of Date/Time data type in the results.
The Date/Time data type handles years from 1 A.D. to 9999 A.D. in the Gregorian calendar system. Years beyond 9999 A.D. cause an error. The Date/Time data type has a precision of 29 and a scale of 9. To resolve this issue, check the precision value of the column in the source system. If it is greater than 29, reduce the precision value.
A profile run fails with the error "***ERROR: nsort_release_recs() returns -10".
To resolve this issue, increase the disk space storage of the hard drive where the Secure Agent is installed.
Data profiling task for some catalog sources fails with the error "Internal error. The DTM process terminated unexpectedly."
This issue occurs when the length of a column name in a table exceeds the maximum column length that the source system allows. The following source systems allow a maximum column length of 73 characters:
- Amazon Redshift
- Amazon S3
- Google BigQuery
- Google Cloud Storage
- JDBC (IBM DB2)
- Microsoft Azure Data Lake Storage Gen2
- Microsoft Azure SQL Server
- Microsoft Azure Synapse
- Microsoft SQL Server
- Oracle
- Snowflake
JDBC (PostgreSQL) allows a maximum of 63 characters, and JDBC (MySQL) allows a maximum of 64 characters.
For the profiling task to complete successfully, rename the column so that its name doesn't exceed the maximum length allowed for the source system.
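One way to catch offending columns before a profile run is to compare column-name lengths against these limits. A minimal sketch, assuming you can list the column names yourself (the dictionary keys and helper name are illustrative, not product APIs):

```python
# Maximum column-name lengths per source system, taken from the limits above.
MAX_COLUMN_LENGTH = {
    "Amazon Redshift": 73,
    "Amazon S3": 73,
    "Google BigQuery": 73,
    "Google Cloud Storage": 73,
    "JDBC (IBM DB2)": 73,
    "Microsoft Azure Data Lake Storage Gen2": 73,
    "Microsoft Azure SQL Server": 73,
    "Microsoft Azure Synapse": 73,
    "Microsoft SQL Server": 73,
    "Oracle": 73,
    "Snowflake": 73,
    "JDBC (PostgreSQL)": 63,
    "JDBC (MySQL)": 64,
}

def columns_over_limit(column_names, source_system):
    """Return the column names that exceed the source system's maximum length."""
    limit = MAX_COLUMN_LENGTH[source_system]
    return [name for name in column_names if len(name) > limit]

# Example: a 70-character name is fine for Snowflake (73) but too long for PostgreSQL (63).
cols = ["customer_id", "x" * 70]
print(columns_over_limit(cols, "Snowflake"))          # []
print(columns_over_limit(cols, "JDBC (PostgreSQL)"))  # the 70-character name
```

Any column returned by the check would need to be renamed before the profiling task can complete.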
Data profiling task for some catalog sources fails if the user is assigned only the Governance Administrator role.
This issue impacts the following source systems:
• Amazon S3
• Google BigQuery
• Google Cloud Storage
• Microsoft Azure Data Lake Storage Gen2
• Amazon Redshift
• Snowflake
For the data profiling task to complete successfully, assign the Designer role to the user.
The data profiling task for a Snowflake catalog source fails with the following exception:
"SEVERE: Exception creating result java.lang.ExceptionInInitializerError at sun.misc.Unsafe.ensureClassInitialized(Native Method)"
To resolve this issue, perform the following steps:
1. In the Informatica Intelligent Cloud Services Administrator, go to Runtime Environments.
2. Select the Secure Agent that runs the data profiling task, and click Edit.
3. From the Service menu, select Data Integration Server and set the following values for these parameters:
- JVMOption1: "-Xms1024m"
- JVMOption2: "-Xmx4096m"
4. Click Save.
5. Rerun the data profiling task for the Snowflake catalog source.
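The two JVM options above are standard Java heap flags applied to the Data Integration Server process:

```
-Xms1024m   # initial Java heap size: 1024 MB
-Xmx4096m   # maximum Java heap size: 4096 MB
```

Raising the maximum heap gives the profiling job more memory headroom, which is what resolves the initialization error. The 1024 MB and 4096 MB values are the ones this procedure prescribes; if the Secure Agent host has less free memory, a smaller maximum may be necessary.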
Data profiling task for a catalog source may fail with the error "Mapping execution failed for object: <table_name>: EP_13236 Could not open the following dll: [.//libpmdpaggregate.so]".
This happens if the Data Quality service is not enabled for the Secure Agent on which you are running the data profiling task.
To resolve this issue, perform the following steps:
1. In the Informatica Intelligent Cloud Services Administrator, go to Runtime Environments.
2. Click the Actions menu for the Secure Agent that runs the data profiling task.
3. From the menu, select Enable or Disable Services, Connectors.
4. Select the Data Quality service from the list of services, and click OK.