Property | Description |
---|---|
Task Name | Name of the task. |
Instance ID | Instance number for the task. For example, if you are looking at the third run of the task, this field displays "3." |
Task Type | Task type, which in this case is code task. |
CodeTask ID | Unique identifier for the code task. |
Started By | Name of the user who started the job. |
Start Time | Date and time that the job was started. |
End Time | Date and time that the job completed or stopped. |
Duration | Amount of time the job ran before it completed or was stopped. |
Runtime Environment | Runtime environment in which the job ran. |
Advanced configuration | Advanced configuration that was used to create the advanced cluster. |
Cluster | Advanced cluster where the advanced job runs. You can click the cluster name to navigate directly to the monitoring details for the cluster. |
Property | Description |
---|---|
Status | Job status, for example Running or Aborted. |
Session Log | Allows you to download the session log file. By default, Informatica Intelligent Cloud Services stores session logs for the 10 most recent runs before overwriting them with logs from newer runs. Session log files are written to the following directory: <Secure Agent installation directory>/apps/Data_Integration_Server/logs (See the sketch after this table.) |
Requested Compute Units Per Hour | Number of serverless compute units per hour that the task requested. You can view the number of requested compute units if the task runs in a serverless runtime environment. |
Total Consumed Compute Units | Total number of serverless compute units that the task consumed. You can view the number of consumed compute units if the task runs in a serverless runtime environment. |
Error Message | Error message, if any, that is associated with the job. |
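Because session logs are ordinary files under the Secure Agent installation, they can also be inspected outside of Monitor. The following Scala sketch lists the most recently modified files in the log directory named above; passing the installation directory as a command-line argument is an assumption made for illustration.

```scala
import java.nio.file.{Files, Paths}
import scala.jdk.CollectionConverters._

object ListSessionLogs {
  def main(args: Array[String]): Unit = {
    // Assumption: the Secure Agent installation directory varies per machine,
    // so it is passed in as the first argument.
    val agentHome = args.headOption.getOrElse(
      sys.error("usage: ListSessionLogs <Secure Agent installation directory>"))
    val logDir = Paths.get(agentHome, "apps", "Data_Integration_Server", "logs")

    // Session logs are kept for 10 runs before being overwritten, so the
    // 10 most recently modified files cover everything still retained.
    Files.list(logDir).iterator().asScala.toSeq
      .sortBy(p => -Files.getLastModifiedTime(p).toMillis)
      .take(10)
      .foreach(p => println(p.getFileName))
  }
}
```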
Property | Required / Optional | Description |
---|---|---|
Override Code Task Timeout | Optional | Overrides the code task timeout value for this execution. A value of -1 signifies no timeout. |
Log Level | Optional | Log level for session logs, agent job log, Spark driver, and executor logs. Valid values are: none, terse, normal, verboseInitialization, or verboseData. The default value is normal. |
Property | Required / Optional | Description |
---|---|---|
Main Class | Required | Entry point of the Spark application. For example: org.apache.spark.examples.company.SparkExampleApp (See the sketch after this table.) |
Main Class Arguments | Optional | Ordered arguments sent to the Spark application main class. For example: --appType SPARK_PI_FILES_JARS --classesToLoad com.company.test.SparkTest1Class |
Primary Resource | Required | Scala JAR file that contains the code task. |
JAR File Path | Optional | The directory and file name of the JAR file that is uploaded to the cluster and added to the Spark driver and executor classpaths. |
Spark File Path | Optional | The directory and file name of the Spark file that is uploaded to the cluster and available under the current working directory. |
Custom Properties | Optional | Spark properties or other custom properties that Data Integration uses. |
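These properties map onto an ordinary Spark entry point. The Scala sketch below shows what a minimal main class such as the example org.apache.spark.examples.company.SparkExampleApp might look like; the class name and arguments are taken from the examples above, while the argument-parsing logic is an illustrative assumption, not part of the product.

```scala
package org.apache.spark.examples.company

import org.apache.spark.sql.SparkSession

object SparkExampleApp {
  def main(args: Array[String]): Unit = {
    // Parse ordered "--key value" pairs, matching the argument example above:
    //   --appType SPARK_PI_FILES_JARS --classesToLoad com.company.test.SparkTest1Class
    val options = args.sliding(2, 2).collect {
      case Array(key, value) if key.startsWith("--") => key.drop(2) -> value
    }.toMap

    val spark = SparkSession.builder()
      .appName(options.getOrElse("appType", "SparkExampleApp"))
      .getOrCreate()

    try {
      // The code task's actual Spark logic would go here.
      println(s"Running with options: $options")
    } finally {
      spark.stop() // always release cluster resources
    }
  }
}
```

The JAR that contains this class is what the Primary Resource property points to, and the cluster invokes the Main Class with the Main Class Arguments in the order given.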
Property | Description |
---|---|
Status | Status of the Spark task, for example Running or Aborted. If the Secure Agent fails while the job is running, the status of the Spark task continues to display Running. You must cancel the job and run it again. |
Start time | Date and time when the Spark task started. |
End time | Date and time when the Spark task ended. |
Duration | Amount of time that the Spark task ran. |
Memory Per Executor | Amount of memory that each Spark executor uses. |
Cores Per Executor | Number of cores that each Spark executor uses. (See the sketch after this table.) |
Driver and Agent Job Logs | Select Download to download the Spark driver and agent job logs. |
Advanced Log Location | The log location that is configured in the advanced configuration for the advanced cluster. You can navigate to the advanced log location to view and download the agent job log, Spark driver log, and Spark executor logs. |
Error Message | Error message, if any, that is associated with the job. |
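Memory Per Executor and Cores Per Executor correspond to the standard Spark properties spark.executor.memory and spark.executor.cores. As a sanity check, a code task can log the values it actually received at runtime; the following Scala sketch illustrates that idea.

```scala
import org.apache.spark.sql.SparkSession

object ExecutorSettingsCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ExecutorSettingsCheck").getOrCreate()
    val conf = spark.sparkContext.getConf

    // Log the executor sizing this run actually received. The fallbacks are
    // Spark's own defaults, used when the property is not set on the cluster.
    println(s"Memory per executor: ${conf.get("spark.executor.memory", "1g")}")
    println(s"Cores per executor:  ${conf.get("spark.executor.cores", "1")}")

    spark.stop()
  }
}
```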
Property | Description |
---|---|
Job Name | Name of the Spark job or stage. |
Duration | Amount of time that the Spark job or stage ran. |
Total Tasks | Number of tasks that the Spark job or stage attempted. (See the sketch after this table.) |
Failed Tasks | Number of tasks that the Spark job or stage failed to complete. |
Input Size / Records | Size of the file and number of records input by the Spark job or stage. |
Output Size / Records | Size of the file and number of records output by the Spark job or stage. |
Status | Status of the Spark job or stage, for example Running or Aborted. Note: After you abort a code task, there might be some lag time before the Monitor service shows the status as Aborted. |
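The per-job and per-stage figures in this table (tasks, input and output records) are the same metrics that Spark collects internally. If you also need them from inside a code task, one option is a custom SparkListener; the sketch below logs them per stage and is only an illustration, not part of the Monitor service.

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}
import org.apache.spark.sql.SparkSession

// Minimal sketch: a listener that logs per-stage task and record counts,
// mirroring the Total Tasks and Input/Output Records columns above.
class StageMetricsLogger extends SparkListener {
  override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
    val info = event.stageInfo
    val m = info.taskMetrics
    println(s"Stage ${info.stageId} '${info.name}': tasks=${info.numTasks}, " +
      s"inputRecords=${m.inputMetrics.recordsRead}, " +
      s"outputRecords=${m.outputMetrics.recordsWritten}")
  }
}

object MetricsExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("MetricsExample").getOrCreate()
    spark.sparkContext.addSparkListener(new StageMetricsLogger)
    // ... run the code task's transformations here ...
    spark.stop()
  }
}
```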