On the job overview page, monitor the status of data quality tasks and retry tasks that have completed with errors.
After you run the job for a catalog source that has data quality enabled, you can monitor the status of the data quality task and view the results on the Overview tab for that job. The data quality results include statistics such as the numbers of discovered, profiled, skipped, and failed objects.
Data quality tasks create the following subtasks:
• Rule generation
• Rule profiling
• Bulk ingestion
• Task progress tracker
• Batch profiling
• Score propagation
If the data quality task completes with errors, you can retry it. To retry a task, hover over the data quality task and click the Retry Task icon. When you retry a task, a new rule profiling subtask is created for the data quality task, and data quality checks run on all the objects.
The number of batch profiling subtasks and the maximum number of objects in each batch profiling subtask depend on the mps_maxDeploymentBatchSize property of the Metadata Platform Service configured for a Secure Agent.
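To see how the batch size drives the number of subtasks, the following sketch computes the batch count for an illustrative workload. The object count and the mps_maxDeploymentBatchSize value are assumed example numbers, not documented defaults; the actual batching is performed by the Metadata Platform Service.

```python
import math

# Assumed example values for illustration only.
objects_to_profile = 2500   # objects selected for profiling
mps_max_batch_size = 1000   # hypothetical mps_maxDeploymentBatchSize setting

# Each batch profiling subtask handles at most mps_maxDeploymentBatchSize
# objects, so the subtask count is the ceiling of the division.
num_subtasks = math.ceil(objects_to_profile / mps_max_batch_size)
print(num_subtasks)  # 3
```

With these example values, the job creates three batch profiling subtasks: two full batches of 1000 objects and one partial batch of 500.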
You can view the logs for the objects on the Logs tab; sort the data quality tasks by time to find the entries you want. After the data quality task completes, you can view the results on the Results tab.
You can download logs to troubleshoot failed data quality tasks. Click Failed Session Logs to download the logs. The name of the downloaded ZIP file is the catalog source job ID, which helps you identify the failed data quality task. You can also find the catalog source job ID in the URL of the Monitor tab, as shown in the following image:
Extract the ZIP file. It contains one inner ZIP file for each failed batch profiling task, and each inner ZIP file includes the data quality session log files.
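Because each inner ZIP must be extracted in turn, a small script can save manual effort. The following is a minimal sketch using only the Python standard library; the function name and file names are hypothetical, and the real archive layout may differ.

```python
import zipfile
from pathlib import Path

def extract_session_logs(outer_zip: str, target_dir: str) -> list:
    """Extract a downloaded logs archive and every nested per-batch ZIP.

    Hypothetical helper: the outer ZIP is assumed to be named after the
    catalog source job ID, with one inner ZIP per failed batch profiling
    task, each holding data quality session log files.
    """
    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)

    # Extract the outer archive first.
    with zipfile.ZipFile(outer_zip) as outer:
        outer.extractall(target)

    # Then extract each inner ZIP into its own subdirectory.
    extracted = []
    for inner in sorted(target.glob("*.zip")):
        with zipfile.ZipFile(inner) as z:
            z.extractall(target / inner.stem)
        extracted.append(inner.name)
    return extracted
```

For example, calling `extract_session_logs("12345.zip", "session_logs")` would leave one subdirectory of log files per failed batch profiling task under `session_logs`.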
The following image shows the job Overview page for a data quality task: