When you add a step to a taskflow, you set properties associated with each step.
Note: When you set a value for a field in any taskflow step, you can't enter curly brackets {}.
Assignment step
When you add an Assignment step, you can configure the following properties:
Name
The name of the Assignment step. The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Assignments
The field to which you assign a value and the source from which the field takes the value. You select a field name from the list of fields that you defined under Start > Fields. The Value options that you see depend on the data type that you defined for the field under Start > Properties > Input Fields or Temporary Fields.
For example, if you define the data type of an input field as Date Time, the following Value options appear for that field in the Assignment step:
- Specific date
- Time from now
- Days from today
- Days before/after
- Field
- Formula
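For example, if you select the Formula option for a field of type Text, you might assign a value with an XQuery expression similar to the following sketch. The temporary field OrderCount and the input field Region are hypothetical placeholders for fields that you define under the Start step:
fn:concat("Processed ", $temp.OrderCount, " orders for region ", $input.Region)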
You can override runtime parameters in the Assignment step.
Data Task step
When you add a Data Task step, you set some properties.
The following sections describe the Data Task step properties:
General
In the general properties, you can specify a descriptive name for the Data Task step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}, +, dot (.), or comma (,).
Data Task
In the Data Task step properties, select the task from a list of existing tasks that you want to add to the taskflow.
Note: You must have an existing task to add to a taskflow. You cannot create a task during the taskflow creation process.
When you add a mapping task to a Data Task step, you see a description, input fields, and output fields. The input fields show the in-out parameters that the mapping task uses.
The output fields show the output fields that the mapping task returns after the taskflow runs.
When you click the Add icon on the Data Task step, you see one of the following views:
•If the Data Task step contains a task, a non-editable view of the task opens.
•If the Data Task step does not contain a task, you see a dialog box from which you can choose a task.
When you add a Data Task step, a corresponding taskflow temporary field of type Text is created. When you add a task to the Data Task step, the temporary field type changes to the name of the task. See Temporary fields for details.
Input Fields
The Input Fields section appears when you add a task to the taskflow.
In the Max Wait (Seconds) field, you can configure the maximum length of time in seconds that the Data Task step waits for the data integration task to complete. Specify a value between 1 and 604800 seconds. Default is 604800 seconds, which is 7 days. If the task is not completed within the maximum time specified in the field, the task stops running and the subsequent task in the taskflow starts running.
Note: If the specified value is less than 1 or greater than 604800, the maximum wait time is automatically set to 604800 seconds.
If the task contains parameters that you can override, you can add input fields. You can set properties for an input field to override Data Integration runtime parameters. For information about runtime parameters, see Overriding parameters or parameter files in a Data Task step.
If the Data Task step uses a PowerCenter task or mapping task, you can override the values of input parameters and in-out parameters of the task.
If the Data Task step uses a mapping task, you can perform the following override actions:
•If the mapping task contains a parameter file available on the Secure Agent machine, you can override the parameter file directory and parameter file name.
•If the mapping task contains a parameter file available in a cloud-hosted repository, you can override the parameter file connection and parameter file object. Data Integration supports only the Amazon S3 V2, Azure Data Lake Store Gen2, and Google Storage V2 connection types for mapping tasks.
•If the mapping task uses data formatting options, you can override the data formatting and default precision values of the source data. These options are available in the input fields only if the formatting file is uploaded to the mapping task and not to the mapping. The precision value set in the default precision field takes precedence over the precision set in the data format field or the mapping task. The default precision value is applied to all the columns in the formatting file.
•If the mapping task contains a Lookup transformation, you can override the values of the lookup object and lookup condition.
Note: You cannot override the value of an input parameter of type string or text from the parameter file. However, you can override the input parameter value from the taskflow. You can override the connection parameter values from the parameter file.
If the Data Task step uses a dynamic mapping task, you can add an input parameter named Job Filter. You cannot edit the name of the input field. However, you can specify the groups and jobs from the dynamic mapping task that you want to run in a taskflow.
To specify the groups and jobs, click Edit, and then enter the value as <group_name>.<job_name> for the input field with the Content type. For example, if you want to run Group_1 and Job_1 from the dynamic mapping task, enter the value as Group_1.Job_1 in the Job Filter input field.
If you do not add the Job Filter input field, by default, the taskflow runs all the jobs available in the dynamic mapping task in the specified order.
Output Fields
The Output Fields section appears after you add a task, such as a synchronization task or a PowerCenter task, to the taskflow. The Output Fields section is an exhaustive list of the output data fields that appear when the task runs.
The following image shows the Output fields you see:
If the mapping task runs on an advanced cluster, you see the following fields:
Note: For a mapping task that runs on an advanced cluster, success source rows and failed source rows are not populated when the task runs.
If you use a data transfer task or a dynamic mapping task, you see the following fields:
To view the values of each output field, run the taskflow and go to the Taskflow Instance Detail page. For more information about the Taskflow Instance Detail page, see the Monitor help.
You can use output fields in a Data Decision or Assignment step.
For example, create a temporary field with value Formula and use the following expression to assign data to the field:
if( ($temp.DataTask1[1]/output[1]/Failed_Target_Rows = 0 or $temp.DataTask1[1]/output[1]/Task_Status = '1') and ($temp.DataTask2[1]/output[1]/Success_Target_Rows > 0 and $temp.DataTask2[1]/output[1]/Failed_Target_Rows = 0) and $temp.DataTask3[1]/output[1]/Success_Target_Rows > 0) then 'Pass' else 'Fail'
When you use the temporary field in a Decision step, the taskflow takes the Pass path if the following conditions are met:
•Data Task 1 has no failed target rows or Data Task 1 runs successfully.
•Data Task 2 has at least one successful target row.
•Data Task 2 has zero failed target rows.
•Data Task 3 has at least one successful target row.
Timer Events
Configure the Events properties to add timers to a data task.
Use a Timer event to perform an action based on a schedule. The action could be either at a specific time or after an interval.
When you add a timer to a Data Task step, a new branch appears. Add an event to this branch and specify whether you want the event to run At a specific time or After an interval.
In the following image, the event on the timer branch, a Data Decision step (Decision 1), occurs five minutes after the main data task (Data Task 1) starts:
When the timer fires, the taskflow runs through the entire timer branch in addition to the main branch. If Data Task 1 finishes before Decision 1, the timer branch is not executed.
Select Interrupting if you want the timer to interrupt the main data task. When an interrupting timer fires, the main data task is interrupted and the taskflow runs only the events on the timer branch.
The following image shows an interrupting timer set to occur five minutes after the main data task starts:
When the event on the timer branch, Data Task 2, executes, Data Task 1 is interrupted. The taskflow follows the timer branch. That is, the taskflow runs Data Task 2 and then ends.
If you delete the End step on the timer branch of an interrupting timer, the timer branch rejoins the main branch.
The following image shows an interrupting timer branch with the End step deleted:
The timer event, Data Task 2, executes after 5 minutes and interrupts Data Task 1. The timer branch rejoins the main branch. The taskflow executes Data Task 2, a Parallel Paths step, and then ends.
If you use an interrupting timer, the main data task has no output with respect to this taskflow instance. You see no output fields for the main data task in the job details for the taskflow.
If a Data Task step completes before a timer fires, whether the timer is interrupting or non-interrupting, no timer fires for that Data Task step.
Note: When you run a particular step in a timer branch of a taskflow instance, the steps in the alternate branch also get executed. To avoid this issue, add a dummy step after the step that you would like to run.
Error Handling
Use the Error Handling section to indicate how you want the taskflow to behave when a Data Task step encounters a warning or an error. You can also configure the taskflow behavior when the task associated with a Data Task step fails or does not run.
After you select a task, enter the following error handling properties:
Property
Description
On Warning
The path that a taskflow takes when it encounters a warning in a Data Task step.
A warning occurs when a Data Task step completes incorrectly or incompletely. For example, you see a warning if the Data Task step copies only 20 out of 25 rows from table A to table B.
You can choose from the following options:
- Select Ignore to ignore the warning and move to the next step.
Note: If you select Ignore for a Data Task step with a subsequent Notification Task step and the data task fails, the email notification that you receive does not contain the fault details. To get the fault details in the email, select Custom error handling.
- Select Suspend Taskflow to move the taskflow to the suspended state when it encounters a warning. You can resume the taskflow instance from the All Jobs, Running Jobs, or My Jobs page.
The taskflow resumes from the step at which it was suspended. If you know the reason for the warning, correct the issue and then resume the taskflow.
Default: Ignore
On Error
The path that a taskflow takes when it encounters an error in a Data Task step.
An error occurs when a Data Task step fails. For example, you see an error if the Data Task does not copy table A to table B.
You can choose from the following options:
- Select Ignore to ignore the error and move to the next step.
- Select Suspend Taskflow to move the taskflow to the suspended state when it encounters an error. You can resume the taskflow instance from the All Jobs, Running Jobs, or My Jobs page.
The taskflow resumes from the step at which it was suspended. If you know the reason for the error, correct the issue and then resume the taskflow.
- Select Custom error handling to handle the error in a manner you choose. If you select Custom error handling, two branches appear. The first branch is the path the taskflow follows if no error occurs. The second branch is the custom path the taskflow follows if an error occurs.
Default: Suspend Taskflow
Fail taskflow on completion
The taskflow behavior when the task associated with the Data Task step fails or does not run.
You can configure a taskflow to fail on its completion if the task associated with the Data Task step fails or does not run. If the task fails or does not run, the taskflow continues running the subsequent steps. However, after the taskflow completes, the taskflow status is set to failed.
Note: If you configure both the Suspend on Fault taskflow advanced property and the Fail taskflow on completion property, the Suspend on Fault property takes precedence. In this case, if the task associated with the Data Task step fails or does not run, the taskflow is suspended. The taskflow does not run the subsequent steps after the Data Task step.
The following image shows a Custom error handling path with an Assignment step and another Data Task step:
IntegrationOps Task step
Use the IntegrationOps Task step to run a published Application Integration process from a taskflow. In the IntegrationOps Task step, you can select an existing published Application Integration process.
When you use the IntegrationOps Task step to run an Application Integration process from a taskflow, consider the following limitations:
•The IntegrationOps Task step uses the authentication configured for the taskflow. It ignores the basic authentication configured for the process.
•You cannot use custom process objects for the process input. Instead, you can pass the entire XML as input for the custom process objects.
•If the IntegrationOps Task step fails, the taskflow continues to run the subsequent steps.
•An IntegrationOps Task step might contain output fields with simple types that are not supported by taskflows. Only fields with simple types that are supported by taskflows are available for orchestration in the subsequent steps.
•If the Application Integration process used in the IntegrationOps Task step is modified or the Secure Agent in the process is changed, you must reselect the process and publish the taskflow again. Otherwise, the IntegrationOps Task step fails at run time.
•When you add a process to a taskflow, you can't pass input values of the picklist and multi-select picklist data types.
The following sections describe the IntegrationOps Task step properties:
General
In the general properties, you can specify a descriptive name for the IntegrationOps Task step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
IntegrationOps Task
In the IntegrationOps Task step properties, select CAI Process in the Task Executor field. In the Task field, select the process that you want to add to the taskflow. The API Name field is automatically populated with the API name of the Application Integration process.
Note: You must have an existing published process to add to a taskflow.
The Select IntegrationOps Task dialog box lists all the published Application Integration processes. Ensure that you select a process with the binding set to REST/SOAP.
When you add a process to an IntegrationOps Task step, you see a description, the input fields that the process uses, and the output fields configured in the process. You can use the input fields and output fields to orchestrate subsequent tasks in the taskflow.
Note: When you open an imported taskflow asset that includes multiple processes, the Validation panel might list errors for IntegrationOps task steps that contain unpublished processes. The errors disappear after all the processes are published.
Input fields
If the IntegrationOps Task step uses a process that contains input fields, you can override the values of the input fields.
IntegrationOps Task step scenarios
Consider the following scenarios when you use an Application Integration process in the IntegrationOps Task step:
•If the process is suspended, the taskflow remains in the Running state. However, the taskflow instance resumes after 24 hours.
•If the process contains input fields of the list of simple type, you cannot assign a value to the input field using the Field option. However, you can use the Formula option to assign the value.
•If the input field or output field name contains a space, the IntegrationOps Task step fails.
Notification Task step
A Notification Task step sends a notification to specified recipients.
You can configure the Notification Task step to send an email notification. For example, you can send an email notification to inform recipients about the number of success rows and error rows that were encountered in a Data Task step of a taskflow.
You can define properties for the Notification Task step to configure the email recipients and email content. You cannot use distribution lists in the Email To, Email Cc, and Email Bcc fields.
The following sections describe the Notification Task step properties:
General properties
In the general properties, you can specify a descriptive name for the Notification Task step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Notification Task properties
You can configure the following Notification Task step properties:
Notification Method
The type of notification to be sent.
The value of this field is set to Email by default. You cannot edit this value. The other values are reserved for future use.
Email To
Required. The primary recipients for the email notification.
Use one of the following options to specify the value for this field:
- Content. Enter one or more valid recipient email addresses. You can also add fields that contain valid email addresses. Separate email addresses and fields with a comma (,) or a semicolon (;).
- Field. Select the field that the Taskflow Designer uses to write the email addresses into this field when this step executes. You can select an input field or a temporary field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
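For example, if you select the Formula option, you might build the recipient list with an XQuery expression similar to the following sketch. The input field AdminEmail and the address admin2@example.com are hypothetical:
fn:concat($input.AdminEmail, ";admin2@example.com")
The expression resolves to a semicolon-separated list of recipient email addresses.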
Email Cc
The recipients who need to be sent a copy of the email notification.
Use one of the following options to specify the value for this field:
- Content. Enter one or more valid recipient email addresses. You can also add fields that contain valid email addresses. Separate email addresses and fields with a comma (,) or a semicolon (;).
- Field. Select the field that the Taskflow Designer uses to write the email addresses into this field when this step executes. You can select an input field or a temporary field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
Email Bcc
The additional recipients who need to be sent a copy of the email notification. The recipients in the Email To and Email Cc fields will not be able to see the recipients in the Email Bcc field. If the field contains more than one recipient, the recipients will not be able to see the other recipients in the Email Bcc field.
Use one of the following options to specify the value for this field:
- Content. Enter one or more valid recipient email addresses. You can also add fields that contain valid email addresses. Separate email addresses and fields with a comma (,) or a semicolon (;).
- Field. Select the field that the Taskflow Designer uses to write the email addresses into this field when this step executes. You can select an input field or a temporary field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
Email Subject
A short and descriptive subject that introduces the email.
Use one of the following options to specify the value for this field:
- Content. Enter a subject for the email.
- Field. Select the field that the Taskflow Designer uses to write the email subject into this field when this step executes. You can select an input field or a temporary field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
Email Content Type
The type of formatting that you want to use for the email content.
Select one of the following values:
- HTML. Select HTML to use formatting options such as bold, italics, underlines, lists, indentations, and fonts. You can also insert tables and links.
- Plain Text. Select Plain Text to add regular text without any formatting or special layout options.
Default is Plain Text.
Email Body
The content that you want to send in the email.
Use one of the following options to specify the value for this field:
- Content. Enter content for the email body.
You can click Edit Content to open a rich text editor for formatting the content.
The email content appears within HTML tags on the All Jobs, Running Jobs, and My Jobs pages.
- Field. Select the field that the Taskflow Designer uses to write the email body into this field when this step executes. You can select an input field or a temporary field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
Email notification examples
You can define the content of the email body based on the number of data tasks in the taskflow to send an email notification with HTML content.
Single data task
When you use the Notification Task step for a single data task, you can enter the data task details in the email body with the source type as content.
For example, you might pass the following XQuery expression:
Hi, Data task with Run Id { $temp.DataTask1[1]/output[1]/Run_Id } started at { $temp.DataTask1[1]/output[1]/Start_Time } and completed at { $temp.DataTask1[1]/output[1]/End_Time } with a status of { $temp.DataTask1[1]/output[1]/Task_Status }
In the above example, if the data task runs successfully, you will receive the output details in an email notification.
In case the data task fails, you can use the fault fields in the XQuery expression to determine the reason for failure as shown in the following example:
Hi, Data task with Run Id { $temp.DataTask1[1]/fault[1]/detail[1]/errOutputDetail[1]/Run_Id } started at { $temp.DataTask1[1]/fault[1]/detail[1]/errOutputDetail[1]/Start_Time } and completed at { $temp.DataTask1[1]/fault[1]/detail[1]/errOutputDetail[1]/End_Time } with a status of { $temp.DataTask1[1]/fault[1]/detail[1]/errOutputDetail[1]/Task_Status }
Multiple data tasks
When you use the Notification Task step to send an email summary for multiple data tasks, you can use an XQuery expression with the source type as formula.
For example, you might pass the following XQuery expression for a taskflow that contains two data tasks named <DataTask1> and <DataTask2>:
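The exact expression depends on the details that you want to include in the summary. The following is a minimal sketch that assumes both data tasks expose the Run_Id and Task_Status output fields shown in the single data task example:
fn:concat("Data task 1 with Run Id ", $temp.DataTask1[1]/output[1]/Run_Id, " completed with status ", $temp.DataTask1[1]/output[1]/Task_Status, ". Data task 2 with Run Id ", $temp.DataTask2[1]/output[1]/Run_Id, " completed with status ", $temp.DataTask2[1]/output[1]/Task_Status, ".")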
In the above example, if the data tasks run successfully, you will receive the output details in an email notification.
In case of failed data tasks, you can use the fault fields in the XQuery expression to determine the reason for failure as shown in the following example:
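The following is a minimal sketch that assumes both data tasks expose the fault fields shown in the single data task example:
fn:concat("Data task 1 with Run Id ", $temp.DataTask1[1]/fault[1]/detail[1]/errOutputDetail[1]/Run_Id, " failed with status ", $temp.DataTask1[1]/fault[1]/detail[1]/errOutputDetail[1]/Task_Status, ". Data task 2 with Run Id ", $temp.DataTask2[1]/fault[1]/detail[1]/errOutputDetail[1]/Run_Id, " failed with status ", $temp.DataTask2[1]/fault[1]/detail[1]/errOutputDetail[1]/Task_Status, ".")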
Before sending an email notification, you might want to convert the timezone based on the recipient's location. To do this, you can use the infa:format XQuery function in the email subject or email body. For more information about converting the timezone using a formula, see the following community article:
Certain restrictions apply when you use the Content option to specify the email body and the HTML option to specify the email content type in a Notification Task step.
Consider the following guidelines:
•If you use HTML content, you may not receive a valid formatted email. You must use a variable to define the HTML content and serialize the variable in the Expression Editor.
For example, if the HTML content is as follows:
<html> Order {$output.OrderID} has been submitted for {$input.CustomerEmail}. <br/> <b>Order details for your records.</b> <br/><br/> Item Cost: {$temp.InventoryDetails[1]/ItemCostPrice} Item Count: {$temp.InventoryDetails[1]/ItemCount } Item Sell Price: {$temp.InventoryDetails[1]/ItemSellingPrice } Commission Percentage: {$temp.InventoryDetails[1]/SalesCommissionInPercentage } <br/><br/> <b>Margins</b> Overall Profit: {$output.Calculate_Margin_ServiceResponse[1]/MarginBeforeCommission} Sales Commission: {$output.Calculate_Margin_ServiceResponse[1]/SalesCommission} Profit after Commission: {$output.Calculate_Margin_ServiceResponse[1]/MarginAfterCommission} </html>
Use the following content in the Expression Editor to define a variable for the HTML content and serialize the variable to receive a valid formatted email:
let $doc := <html> Order {$output.OrderID} has been submitted for {$input.CustomerEmail}. <br/> <b>Order details for your records.</b> <br/><br/> Item Cost: {$temp.InventoryDetails[1]/ItemCostPrice} Item Count: {$temp.InventoryDetails[1]/ItemCount } Item Sell Price: {$temp.InventoryDetails[1]/ItemSellingPrice } Commission Percentage: {$temp.InventoryDetails[1]/SalesCommissionInPercentage } <br/><br/> <b>Margins</b> Overall Profit: {$output.Calculate_Margin_ServiceResponse[1]/MarginBeforeCommission} Sales Commission: {$output.Calculate_Margin_ServiceResponse[1]/SalesCommission} Profit after Commission: {$output.Calculate_Margin_ServiceResponse[1]/MarginAfterCommission} </html> return serialize($doc)
•If the HTML content contains valid XML, use the XQuery function util:toXML in the Expression Editor to serialize the content into the String format.
•If the HTML content does not contain valid XML, you must either convert the HTML content to valid XML or use the String Concat function with an escape character. Sometimes, even if the HTML content contains valid XML, you do not receive a valid formatted email. In this case, use the String Concat function.
The following example shows how to use the String Concat function with an escape character:
fn:concat("<html>
<head>TaskDetails</head>
<body>
<table>
<tbody>
<tr>
<td>Task Name</td>
<td>Job Id</td>
<td>Status</td>
<td>Start Time</td>
<td>End Time</td>
</tr>
<tr>
<td>Williams</td>
<td>John</td>
</tr>
</tbody>
</table>
</body>
</html>","")
Command Task step
Use a Command Task step to run shell scripts or batch commands from multiple script files on the Secure Agent machine.
For example, you can use a command task to move a file, copy a file, zip or unzip a file, or run clean scripts or SQL scripts as part of a taskflow. You can use the Command Task outputs to orchestrate subsequent tasks in the taskflow.
When you add a Command Task step to a taskflow, you configure the following properties:
General Properties
In the general properties, you can specify a descriptive name for the Command Task step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Input Fields
You can use the following system variables in the input fields to define the script file name, input argument, and work directory:
•$PMRootDir
•$PMLookupFileDir
•$PMSessionLogDir
•$PMBadFileDir
•$PMCacheDir
•$PMStorageDir
•$PMTargetFileDir
•$PMSourceFileDir
•$PMExtProcDir
•$PMTempDir
•$PMWorkflowLogDir
These variables are pre-defined for the Data Integration Server service in Administrator. To use the system variables in the Command Task step, the Common Integration Components and Data Integration Server services must be enabled and up and running on the Secure Agent.
You can use environment variables in the input fields to define the script file name, input arguments, and work directory. To use environment variables, the Secure Agent must have the Common Integration Components service version 14 or later and the command executor package version 140 or later.
Configure the following input fields for a Command Task step:
Fail task if any script fails
When you use multiple script files, you can configure a Command Task step to fail if any script fails.
When you configure one or more scripts in a Command Task step and select this option, if any script fails, the status of the Command Task step is set to failed and the taskflow does not run further scripts or the subsequent steps.
When you configure one or more scripts in a Command Task step and disable this option, the taskflow runs all the scripts. If any script fails, the status of the failed script is set to failed and the status of the Command Task step is set to warning on the All Jobs, Running Jobs, and My Jobs pages. However, Data Integration treats the Command Task step execution as a success.
This option is disabled by default.
Runtime Environment
Required. The runtime environment that runs the command. You can select a runtime environment or a serverless runtime environment.
Use one of the following options to specify the value for this field:
- Content. Select a Secure Agent group.
Note: Sub-organization users must have the Designer role to view the available Secure Agent groups.
- Field. Select the field that the Taskflow Designer uses to write the runtime environment into this field when this step executes. You can select a runtime environment that was added as an input field or temporary field in any other step in the taskflow. The value must be a valid Secure Agent group name.
- Formula. Create an expression for the runtime environment that resolves to a valid Secure Agent group name.
Max Wait (Seconds)
In the Max Wait (Seconds) field, you can configure the maximum length of time in seconds that the Command Task step waits for the task to complete. Specify a value between 1 and 86400 seconds. Default is 86400 seconds. If the task is not completed within the maximum time specified in the field, the task stops running and the subsequent task in the taskflow starts running.
Note: If the specified value is less than 1 or greater than 86400, the maximum wait time is automatically set to 86400 seconds.
Scripts
In a Command Task step, you can add multiple scripts. You can configure a script by specifying the script file name, input arguments, and work directory.
To add more scripts, click the Add icon from the Scripts panel. If you have multiple scripts in the command task, you can drag and drop the scripts to reorder the scripts.
To run scripts from different Secure Agent groups, you must add separate Command Task steps for different runtime environments.
You can view all the scripts in a taskflow instance as subtasks of the command task from the All Jobs, Running Jobs, or My Jobs page. If a script fails, you can also download the log file to understand the reason for the script failure.
Configure the following input fields for a script:
Script File Name
Required. The path and name of the script file that you want to run.
In a serverless runtime environment, you must put your files in a folder named command_scripts. This folder can have subfolders. Informatica Intelligent Cloud Services synchronizes the files in the command_scripts directory to the Secure Agent at regular intervals, specifically to the agent installation directory apps/Common_Integration_Components/data/command/serverless/command_scripts. If you update files in the remote storage location (for example, Amazon S3), Informatica Intelligent Cloud Services automatically synchronizes them to the Secure Agent.
You can use default profiles to run the scripts that contain AWS commands directly in the serverless runtime environment. For more information about running AWS commands using default profiles in the serverless runtime environment, see the following How-To library article:
Use one of the following options to specify the value for this field:
▪ Content. Enter the path to the script that you want to run.
▪ Field. Select the field that the Taskflow Designer uses to write the script file name into this field when this step executes. You can select a script file name that was added as an input field, temporary field, or output field in any other step in the taskflow.
▪ Formula. Create an expression for the script file.
You can use system variables to define the script file that you want to run.
For example, to run the script.bat file located in the root directory, you can enter the value as $PMRootDir/script.bat.
You can use an environment variable to define the script file path based on the operating system.
For example, you have a java_script.bat file located in the following directory:
C:\Scripts
You have created an environment variable named ScriptHome for the directory. To run the java_script.bat file, you can enter the value as %ScriptHome%\java_script.bat for Windows and $ScriptHome\java_script.bat for Linux.
You can also create an environment variable for the full path including the script file name.
In the serverless runtime environment, you can provide directories mounted to the EFS or NFS file system to run the commands. This is useful when you migrate jobs from a Secure Agent group environment to a serverless runtime environment. You can add commands to the script file to copy files to the EFS or NFS mounted directories.
For example, you can log the output of a command to a file and copy the file to a mounted directory.
aws s3 ls > log_out.txt
cp log_out.txt /mountDir/logfileName.txt
You can also provide the EFS or NFS mounted directories including the script file name to run the scripts.
For example, you have an aws_script1.sh file located in the following EFS or NFS directory:
mountDir/scripts
To run the aws_script1.sh file, you can enter the value as /mountDir/scripts/aws_script1.sh.
For more information about EFS and NFS, see Runtime Environments in the Administrator help.
Input Arguments
Optional. The arguments that you want to pass to the script when it is executed.
Use one of the following options to specify the value for this field:
▪ Content. Enter one or more arguments that you want to pass to the script. Enclose each argument and values in double quotes (") and separate arguments with a comma (,).
For an example of the argument format, see the sketch after this list.
▪ Field. Select the field that the Taskflow Designer uses to write the input arguments into this field when this step executes. You can select an input argument that was added as an input field, temporary field, or output field in any other step in the taskflow.
▪ Formula. Create an expression for the input arguments. Enclose each argument in double quotes (") and separate arguments with a comma (,).
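The following is a minimal sketch of argument values entered with the Content option. The script and the argument values are hypothetical; the script would receive input.csv as the first argument and DEBUG as the second argument:
"input.csv","DEBUG"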
You can define input arguments in the following ways:
Using system variables
You can use system variables to define the input arguments that you want to pass to the script when it is executed. For example, to pass the arguments from the root directory or the temp directory, enter the following value:
"$PMRootDir","$PMTempDir"
Using environment variables
You can use an environment variable to define the input arguments based on the operating system. For example, you have created an environment variable named ScriptHome for the following directory:
C:\Scripts
To pass the arguments from this directory, enter the value as "%ScriptHome%" for Windows and "$ScriptHome" for Linux.
Work Directory
Optional. The path to the working directory where the output of the script is saved. By default, the output is saved in the path where the script is saved.
In a serverless runtime environment, the working directory is set to /command_scripts.
Use one of the following options to specify the value for this field:
▪ Content. Enter the path to the working directory where you want to save the output of the script.
▪ Field. Select the field that the Taskflow Designer uses to write the working directory into this field when this step executes. You can select a work directory that was added as an input field, temporary field, or output field in any other step in the taskflow.
▪ Formula. Create an expression for the work directory.
You can use system variables to define the working directory where you want the output of the script to be saved.
For example, if you want to use the source file directory as the working directory, you can enter the value as $PMSourceFileDir.
You can use an environment variable to define the working directory based on the operating system.
For example, you created an environment variable named ScriptHome for the following directory:
C:\Scripts
To use this directory as the working directory, you can enter the value as %ScriptHome% for Windows and $ScriptHome for Linux.
Note: When you use curly brackets {} in the script or in the input fields, you must add an additional set of curly brackets {{}}. Otherwise, an error occurs because curly brackets are special characters in XQuery.
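For example, a hypothetical input argument whose literal value must contain curly brackets, such as {date}, would be entered with an additional set of curly brackets:
"{{date}}"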
Output Fields
When you run a taskflow, the following output fields are generated for the Command Task step:
- Run Id (Text). The run ID of the command task.
- Start Time (Date Time). The start time of the command task.
- End Time (Date Time). The end time of the command task.
- Exit Code (Integer). The exit code returned after command task execution. The exit code can have one of the following values:
▪ 0. Indicates that the command task was executed successfully.
▪ 1. Indicates that the command task failed.
- Execution Status (Text). Displays the status of the command task as successful.
- Std Error (Text). Displays the error message.
To view the values of each output field, run the taskflow and go to the Taskflow Instance Detail page in Monitor. For more information about the Taskflow Instance Detail page, see the Monitor help.
Events
Configure the Events properties to add timers to a command task.
Use a Timer event to perform an action based on a schedule. The action could be either at a specific time or after an interval.
When you add a timer to a Command Task step, a new branch appears. Add an event to this branch and specify whether you want the event to run At a specific time or After an interval.
In the following image, the event on the timer branch is a Data Decision step (Decision 1) that occurs five minutes after the main command task (Command Task 1) starts:
1. Main branch
2. Timer branch
In the example, the timer branch runs five minutes after Command Task 1 starts. If Command Task 1 finishes before Decision 1, the timer branch is not executed.
You can select the Interrupting option if you want the timer to interrupt the main command task.
The following image shows an interrupting timer set to occur five minutes after the main command task starts:
1. Main branch
2. Timer branch
In the example, Command Task 2 executes after five minutes and interrupts Command Task 1. The taskflow executes only Command Task 2 and then ends. Command Task 1 has no output in this taskflow instance. You see no output fields for Command Task 1 in the job details for the taskflow.
If Command Task 1 completes before the timer, the taskflow executes only Command Task 1 and ends.
If you delete the End step on the timer branch of an interrupting timer, the timer branch rejoins the main branch.
The following image shows an interrupting timer branch with the End step deleted:
1. Main branch
2. Timer branch
In the example, Command Task 2 executes after five minutes and interrupts Command Task 1. The timer branch rejoins the main branch. The taskflow executes Command Task 2, a Decision step, a Parallel Paths step, and then ends.
If Command Task 1 completes before the timer, the taskflow executes Command Task 1, a Decision step, a Parallel Paths step, and then ends.
Error Handling
Use the Error Handling tab to indicate how you want the taskflow to behave when a Command Task step encounters an error. You can also configure the taskflow behavior when the Command Task step fails or does not run.
After you select a task, configure the following error handling properties:
On Error
The path that a taskflow takes when it encounters an error in a Command Task step.
An error occurs when a Command Task step fails. You can choose from the following options:
- Select Ignore to ignore the error and move to the next step.
Note: If you select Ignore for a Command Task step with a subsequent Notification Task step and the command task fails, the email notification that you receive does not contain the fault details. To get the fault details in the email, select Custom error handling.
- Select Suspend Taskflow to move the taskflow to the suspended state when it encounters an error. You can resume the taskflow instance from the All Jobs, Running Jobs, or My Jobs page.
The taskflow resumes from the step at which it was suspended. If you know the reason for the error, correct the issue and then resume the taskflow.
- Select Custom error handling to handle the error in a manner you choose. If you select Custom error handling, two branches appear. The first branch is the path the taskflow follows if no error occurs. The second branch is the custom path the taskflow follows if an error occurs.
Default is Suspend Taskflow.
Fail taskflow on completion
The taskflow behavior when the Command Task step fails or does not run.
You can configure a taskflow to fail on its completion if the Command Task step fails or does not run. If the step fails or does not run, the taskflow continues running the subsequent steps. However, after the taskflow completes, the taskflow status is set to failed.
If you configure both the Suspend on Fault taskflow advanced property and the Fail taskflow on completion property, the Suspend on Fault property takes precedence. In this case, if the Command Task step fails or does not run, the taskflow is suspended. The taskflow does not run the subsequent steps after the Command Task step.
The following image shows a Custom error handling path with an Assignment step and another Command Task step:
Fault
The fault fields are displayed only if the command task fails due to a script failure. The details help you analyze the reason for the fault. You can then take an appropriate action on the faulted command task and proceed with the execution of the taskflow.
You can add the details in fault fields as parameters to the subsequent steps of the taskflow as shown in the following image:
If the command task has faulted, you see the following fault details:
- Run Id (Text). The run ID of the command task.
- Start Time (Date Time). The start time of the command task.
- End Time (Date Time). The end time of the command task.
- Exit Code (Integer). The exit code returned after the command task execution. The exit code can have values between 0 and 255. The value 0 indicates that the command task ran successfully. A failed command returns a non-zero value. The value can be based on the error type or pre-configured code in the shell script.
- Execution Status (Text). Displays the status of the command task as failed.
- Std Error (Text). Displays the error message.
File Watch Task step
You can add a File Watch Task step to a taskflow to listen to files in a defined location and monitor file events.
In the File Watch Task step, you can select an existing file listener with the connector source type. You can use file events to orchestrate taskflow execution. For example, you can wait for a file to arrive at a particular location and then consume the file in a subsequent step.
When you run a taskflow that contains a File Watch Task step, the associated file listener starts. When a file event occurs, the file watch task runs and sends the file event details such as a list of arrived, updated, and deleted files along with the file details. The taskflow then proceeds to the subsequent steps.
If a file event does not occur, by default, the taskflow waits for 5 minutes or for the overridden value defined in the Time Out field. After that, the File Watch Task step completes and the taskflow proceeds to the subsequent steps.
The maximum value that you can define in the Time Out field is 7 days. After 7 days, the taskflow status changes to suspended.
The following sections describe the File Watch Task step properties:
General properties
In the general properties, you can specify a descriptive name for the File Watch Task step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
File Watch Task step properties
In the File Watch Task step properties, you can select an existing file listener with the connector source type that you want to add to the taskflow. When you select a file listener, you see a description and output fields for the file listener.
The Output Fields section shows the fileEvents field. The fileEvents field is a list of objects that return the file event details such as a list of arrived, updated, and deleted files along with the file details. The fileEvents field displays a maximum of 1,500 records even if the total number of processed files exceeds 1,500.
Input fields
You can add input fields to override the following file listener properties:
•Notify if files exist on first run
•Runtime Environment
•Source Connection
•Time Out
•File Pattern
•Folder Path
Events
Configure the Events properties to add timers to a file watch task.
Use a Timer event to perform an action based on a schedule. The action could be either at a specific time or after an interval.
When you add a timer to a File Watch Task step, a new branch appears. Add an event to this branch and specify whether you want the event to run At a specific time or After an interval.
In the following image, the event on the timer branch is a Decision step (Decision 1) that occurs five minutes after the main file watch task (File Watch Task 1) starts:
1. Main branch
2. Timer branch
In the example, the timer branch runs five minutes after File Watch Task 1 starts. If File Watch Task 1 finishes before Decision 1, the timer branch is not executed.
You can select the Interrupting option if you want the timer to interrupt the main file watch task.
The following image shows an interrupting timer set to occur five minutes after the main file watch task starts:
1. Main branch
2. Timer branch
In the example, File Watch Task 2 executes after five minutes and interrupts File Watch Task 1. The taskflow executes only File Watch Task 2 and then ends. File Watch Task 1 has no output in this taskflow instance. You see no output fields for File Watch Task 1 in the job details for the taskflow.
If File Watch Task 1 completes before the timer, the taskflow executes only File Watch Task 1 and ends.
If you delete the End step on the timer branch of an interrupting timer, the timer branch rejoins the main branch.
The following image shows an interrupting timer branch with the End step deleted:
1. Main branch
2. Timer branch
In the example, File Watch Task 2 executes after five minutes and interrupts File Watch Task 1. The timer branch rejoins the main branch. The taskflow executes File Watch Task 2, a Decision step, a Parallel Paths step, and then ends.
If File Watch Task 1 completes before the timer, the taskflow executes File Watch Task 1, a Decision step, a Parallel Paths step, and then ends.
Error handling properties
Use the Error Handling tab to indicate how you want the taskflow to behave when a File Watch Task step encounters an error. After you select a file listener, you can configure the On Error property.
The On Error property defines the path that a taskflow takes when it encounters an error in a File Watch Task step. An error occurs when a File Watch Task step fails. You can choose from the following options:
•Select Ignore to ignore the error and move to the next step.
•Select Suspend Taskflow to move the taskflow to the suspended state when it encounters an error. You can resume the taskflow instance from the All Jobs, Running Jobs, or My Jobs page. The taskflow resumes from the step at which it was suspended. If you know the reason for the error, correct the issue and then resume the taskflow.
•Select Custom error handling to handle the error in a manner you choose. If you select Custom error handling, two branches appear. The first branch is the path that the taskflow follows if no error occurs. The second branch is the custom path that the taskflow follows if an error occurs.
The following image shows a custom error handling path with a File Watch Task step:
Default is Suspend Taskflow.
Ingestion Task step
Use an Ingestion Task step to leverage a file ingestion and replication task for taskflow orchestration. In the Ingestion Task step, you can select an existing file ingestion and replication task.
Note: File ingestion and replication tasks that use a file listener component as the source are not available for selection.
You might want to perform data integration operations after moving files to an intermediate location and before transferring the files to the target. In this scenario, you can use the Ingestion Task step in conjunction with the Data Task step.
For example, you can use the Ingestion Task step to read a large number of files from a source location and write them to an intermediate location. You can then use the Data Task step to perform data integration operations on the files and use another Ingestion Task step to write the updated files to the final target location.
The following sections describe the Ingestion Task step properties:
General properties
In the general properties, you can specify a descriptive name for the Ingestion Task step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Ingestion Task step properties
In the Ingestion Task step properties, you can select the file ingestion and replication task that you want to add to the taskflow. When you select a file ingestion and replication task, you see a description and output fields for the file ingestion and replication task.
The Output Fields section shows the following fields:
Property
Description
Success Files
Number of files successfully transferred to the target.
Error Files
Number of files not transferred to the target.
fileDetails
A list of objects that return the file transfer details such as a list of created, updated, and deleted files along with the file count. The fileDetails field displays a maximum of 1,500 records even if the total number of processed files exceeds 1,500.
Input fields
You can add input fields to override the following file ingestion and replication task properties:
•General properties:
- Maximum File Limit. The maximum number of records to display in the fileDetails field in the Output Fields section. When you select a file ingestion and replication task in the Ingestion Task step, the Maximum File Limit field is added by default. The maximum number you can specify is 1500. Default is 0.
- Runtime Environment. Runtime environment that contains the Secure Agent used to run the task.
- Source Connection. The connection that the file ingestion and replication task uses to read from the source.
- Target Connection. The connection that the file ingestion and replication task uses to write to the target.
•Source properties:
- Batch Size. The number of files that a file ingestion and replication task can transfer in a batch.
- File Pattern. The file name pattern to use for selecting the files to transfer.
- Folder Path. The folder path from where the files are transferred. The default value is the folder path specified in the connection.
- Source Directory. The directory from where the files are transferred. The default value is the source directory specified in the connection.
•Target properties:
- Folder Path. The folder path to which the files are transferred. The default value is the folder path specified in the connection.
- Target Directory. The directory to which the files are transferred. The default value is the target directory specified in the connection.
Events
Configure the Events properties to add timers to an ingestion task.
Use a Timer event to perform an action based on a schedule. The action could be either at a specific time or after an interval.
When you add a timer to an Ingestion Task step, a new branch appears. Add an event to this branch and specify whether you want the event to run At a specific time or After an interval.
In the following image, the event on the timer branch is a Decision step (Decision 1) that occurs five minutes after the main ingestion task (Ingestion Task 1) starts:
1. Main branch
2. Timer branch
In the example, the timer branch runs five minutes after Ingestion Task 1 starts. If Ingestion Task 1 finishes before Decision 1, the timer branch is not executed.
You can select the Interrupting option if you want the timer to interrupt the main ingestion task.
The following image shows an interrupting timer set to occur five minutes after the main ingestion task starts:
1. Main branch
2. Timer branch
In the example, Ingestion Task 2 executes after five minutes and interrupts Ingestion Task 1. The taskflow executes only Ingestion Task 2 and then ends. Ingestion Task 1 has no output in this taskflow instance. You see no output fields for Ingestion Task 1 in the job details for the taskflow.
If Ingestion Task 1 completes before the timer, the taskflow executes only Ingestion Task 1 and ends.
If you delete the End step on the timer branch of an interrupting timer, the timer branch rejoins the main branch.
The following image shows an interrupting timer branch with the End step deleted:
1. Main branch
2. Timer branch
In the example, Ingestion Task 2 executes after five minutes and interrupts Ingestion Task 1. The timer branch rejoins the main branch. The taskflow executes Ingestion Task 2, a Decision step, a Parallel Paths step, and then ends.
If Ingestion Task 1 completes before the timer, the taskflow executes Ingestion Task 1, a Decision step, a Parallel Paths step, and then ends.
Error handling properties
Use the Error Handling tab to indicate how you want the taskflow to behave when an Ingestion Task step encounters an error. After you select a file ingestion and replication task, you can configure the On Error property.
The On Error property defines the path that a taskflow takes when it encounters an error in an Ingestion Task step. An error occurs when an Ingestion Task step fails. You can choose from the following options:
•Select Ignore to ignore the error and move to the next step.
•Select Suspend Taskflow to move the taskflow to the suspended state when it encounters an error. You can resume the taskflow instance from the All Jobs, Running Jobs, or My Jobs page. The taskflow resumes from the step at which it was suspended. If you know the reason for the error, correct the issue and then resume the taskflow.
•Select Custom error handling to handle the error in a manner you choose. If you select Custom error handling, two branches appear. The first branch is the path that the taskflow follows if no error occurs. The second branch is the custom path that the taskflow follows if an error occurs.
The following image shows a custom error handling path with an Ingestion Task step:
Default is Suspend Taskflow.
Subtaskflow step
When you add a Subtaskflow step, you can embed and reuse an existing taskflow.
You can use a subtaskflow to reuse the same orchestration flow across multiple branches of a taskflow or across different taskflows. You can then invoke the subtaskflow with different sets of parameters. The subtaskflow is published when you publish its parent taskflow.
When you have a taskflow that contains numerous steps, consider splitting the orchestration logic across multiple smaller taskflows. You can then simplify the design by using the Subtaskflow step to embed the smaller taskflows in the parent taskflow. This not only leads to modular design, but also helps with faster loading when you open the taskflow for editing.
You can define properties for the subtaskflow. The following sections describe the Subtaskflow step properties:
General properties
In the general properties, you can specify a descriptive name for the Subtaskflow step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Subtaskflow properties
On the Subtaskflow tab, select the taskflow that you want to embed. Data Integration displays the location, description, input fields, and output fields for the embedded taskflow.
Input Fields
The Input Fields section appears when you add a subtaskflow to the taskflow.
If the taskflow contains parameters that you can override, you can add input fields. You can set properties for an input field to override Data Integration run-time parameters.
Fault handling properties
You can configure the following fault handling properties:
Catch faults
Select this option to enable fault handling for the Subtaskflow step.
By default, this option is not selected.
Fault field name
Required if you select the Catch faults option.
The name of the field that captures the fault information.
Default is faultInfo.
The name can't start with temp.<name>, input.<name>, or output.<name>.
Fail taskflow on completion
Defines the behavior when the subtaskflow associated with the Subtaskflow step fails or does not run.
By default, when the subtaskflow associated with the Subtaskflow step fails, the parent taskflow also fails.
To configure a taskflow to fail on its completion when the subtaskflow fails, select the Catch faults option and the If this subtaskflow fails option. To configure a taskflow to fail on its completion when the subtaskflow does not run, select the If this subtaskflow does not run option. In these cases, when the subtaskflow fails or does not run, the taskflow continues running the subsequent steps. However, after the taskflow completes, the taskflow status is set to failed.
Note: If you configure both the Suspend on Fault taskflow advanced property and the Fail taskflow on completion property, the Suspend on Fault property takes precedence. In this case, if the subtaskflow associated with the Subtaskflow step fails or does not run, the taskflow is suspended. The taskflow does not run the subsequent steps after the Subtaskflow step.
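The following Python sketch is a conceptual summary of these precedence rules, assuming a simplified model in which the outcome depends only on the properties described above. The names are illustrative and are not part of the product.

```python
# Conceptual model of how a parent taskflow reacts when the subtaskflow in a
# Subtaskflow step fails. All names are illustrative, not product APIs.

def parent_outcome_on_subtaskflow_failure(suspend_on_fault: bool,
                                          catch_faults: bool,
                                          fail_taskflow_on_completion: bool) -> str:
    # Suspend on Fault takes precedence: the taskflow is suspended and the
    # steps after the Subtaskflow step do not run.
    if suspend_on_fault:
        return "suspended"
    # With Catch faults and "If this subtaskflow fails" selected, the
    # remaining steps still run, but the final status is set to failed.
    if catch_faults and fail_taskflow_on_completion:
        return "run remaining steps, then set status to failed"
    # Default behavior: the parent taskflow also fails.
    return "failed"
```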
Decision step
When you add a Decision step, you set some properties.
You can configure the following Decision step properties:
Name
The name of the Decision step. The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Decision
The taskflow makes a decision based on the fields, formulas, and paths that you define here.
Use one of the following options to specify the path:
- Field. Select an input field, output field, or temporary field from the list of fields you define under the Start step.
Enter conditions and values that you want the Decision step to base a decision on.
The conditions available depend on the field that you select.
For example, if you select a field of type Simple > Text, the following conditions are available:
▪ Equals
▪ Starts With
▪ Ends With
▪ Starts with any of
▪ Contains
- Formula. Open the formula editor to create a complex expression.
You can define a path and set appropriate conditions. You can also evaluate simple and complex expressions directly in the Decision step with no dependency on the prior steps.
When you select the Formula option and define the expression, the following conditions are available:
▪ Contains
▪ Equals
▪ Starts with
▪ Ends with
▪ Starts with any of
▪ Not equal to
▪ Less than
▪ Less than or equal to
▪ Greater than
▪ Greater than or equal to
However, you must select the appropriate conditions based on the return type of the functions used in the expression.
The following table describes the supported and unsupported conditions based on the return type:
Return type: String and Boolean
Supported conditions: Contains, Equals, Starts with, Ends with, Starts with any of
Unsupported conditions: Not equal to, Less than, Less than or equal to, Greater than, Greater than or equal to

Return type: Number and Integer
Supported conditions: Contains, Equals, Starts with, Ends with, Starts with any of, Not equal to, Less than, Less than or equal to, Greater than, Greater than or equal to
Unsupported conditions: None

Return type: DateTime
Supported conditions: Contains, Equals, Starts with, Ends with, Starts with any of, Not equal to, Less than, Less than or equal to, Greater than, Greater than or equal to
Unsupported conditions: None
Default is Field.
You can enter text values against the conditions you select.
You can add multiple conditions to a Decision step. Each condition is a potential data path.
For each path that you add, a corresponding branch appears on the UI. Drag branches to rearrange the order in which the branches appear on the UI.
Most Decision steps have an Otherwise path. This path handles execution if no data meets the conditions in your tests.
Evaluating Paths
A taskflow evaluates conditions based on the criteria you specify. Ensure that you construct paths with non-intersecting conditions.
For example, you create a Data Decision step with the following paths:
•Path 1: Field less than or equal to 100.
•Path 2: Field less than or equal to 75.
•Path 3: Field less than or equal to 25.
•Path 4: Otherwise
If the integer field for which the Data Decision step was created has a value of 25, the Data Decision step takes Path 1. This is because 25 is less than or equal to 100, and Path 1 is the first path that the taskflow evaluates.
To ensure that the Data Decision step follows the "Field less than or equal to 25" path, re-create the paths with the following criteria:
•Path 1: Integer between 0 and 25
•Path 2: Integer between 26 and 75.
•Path 3: Integer between 76 and 100.
•Path 4: Otherwise
Important: A taskflow evaluates conditions in a top-down manner. Ensure that the Otherwise branch is the last path.
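The effect of overlapping conditions can be reproduced in a few lines of Python. The following sketch mimics the top-down evaluation described above; the helper function is illustrative and only the path names come from the example.

```python
# Illustrative top-down evaluation of Decision step paths. choose_path is not
# a product function; each path is a (name, condition) pair and the first
# matching condition wins, which mirrors the behavior described above.

def choose_path(value, paths, otherwise="Path 4: Otherwise"):
    for name, condition in paths:
        if condition(value):
            return name
    return otherwise

# Overlapping conditions: a value of 25 satisfies Path 1 first, so Path 1 is taken.
overlapping = [
    ("Path 1", lambda v: v <= 100),
    ("Path 2", lambda v: v <= 75),
    ("Path 3", lambda v: v <= 25),
]
assert choose_path(25, overlapping) == "Path 1"

# Non-intersecting ranges: each value now reaches exactly one intended path.
non_intersecting = [
    ("Path 1", lambda v: 0 <= v <= 25),
    ("Path 2", lambda v: 26 <= v <= 75),
    ("Path 3", lambda v: 76 <= v <= 100),
]
assert choose_path(25, non_intersecting) == "Path 1"
assert choose_path(80, non_intersecting) == "Path 3"
assert choose_path(150, non_intersecting) == "Path 4: Otherwise"
```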
A Decision step can lead to another Decision step. For example, a branch could run if the annual revenue exceeds $100,000. The next Decision step along the same path could then test whether the city is Boston. This technique applies Boolean AND logic because the test for the second condition occurs only on the true branch of the first condition. In this example, the two Decision steps together express the condition "Annual revenue exceeds $100,000 AND city is Boston".
Similarly, to support Boolean OR logic, you can add a test for the second condition on any branch.
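The following sketch is an illustrative rendering of this pattern as nested conditionals, using hypothetical field names.

```python
# Illustrative nesting of Decision steps. annual_revenue and city are
# hypothetical field names used only for this example.

def decide(annual_revenue: float, city: str) -> str:
    if annual_revenue > 100_000:       # first Decision step
        if city == "Boston":           # second Decision step on the true branch
            return "revenue exceeds $100,000 AND city is Boston"
        return "revenue exceeds $100,000, any other city"
    if city == "Boston":               # testing the second condition on the
        return "city is Boston only"   # Otherwise branch models Boolean OR
    return "otherwise"
```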
When the Data Task step of a taskflow fails, you can make decisions based on the output fields of the data task.
You can select the output fields when one of the following conditions is met:
•The On Error field is set to Ignore or Custom error handling.
•The Fail taskflow on completion option is set to If this task fails.
If you select the entire data task as the field, the Decision step takes the Is set path by default.
Parallel Paths step
When you add a Parallel Paths step, you set some properties.
You can configure the following Parallel Paths step properties:
Name
The name of the Parallel Paths step. The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Parallel Paths
The paths that you want the taskflow to run in parallel.
Click Add to add a new branch.
You can add multiple steps to each branch. To add steps to a branch, drag and drop a step from the palette on the left.
You can run the same mapping task in multiple branches of the Parallel Paths step if the mapping task is configured to run simultaneously.
When you use the Jump step in conjunction with the Parallel Paths step, you can jump only to another step on the same Parallel Paths branch.
Keep in mind the following restrictions when you use the Jump step and the Parallel Paths step together:
- If you are in a Parallel Paths step, you cannot jump to a step on another branch of the same Parallel Paths step.
- If you are in a Parallel Paths step, you cannot jump to any step outside the Parallel Paths step.
- If you are outside a Parallel Paths step, you cannot jump to any step inside the Parallel Paths step.
Jump step
When you add a Jump step, you configure the To field to define the target of the jump. You can select from a list of available steps.
More than one step can jump to the same target step. To see how many Jump steps have a particular step as their target, place the cursor over the arrow next to the target step.
When you use the Jump step in conjunction with the Parallel Paths step, you can jump only to another step on the same Parallel Paths branch.
Keep in mind the following restrictions when you use the Jump step and the Parallel Paths step together:
•If you are in a Parallel Paths step, you can't jump to a step on another branch of the same Parallel Paths step.
•If you are in a Parallel Paths step, you can't jump to any step outside the Parallel Paths step.
•If you are outside a Parallel Paths step, you can't jump to any step inside the Parallel Paths step.
•You can't jump to any step from the fault path. You must merge the fault path back to the main taskflow path to add the Jump step.
•You can't jump to a step inside the fault path if you are outside the fault path.
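These restrictions amount to a containment check on the source and target of a jump. The following Python sketch expresses that check under an assumed data model (branch and fault-path membership per step); it is not how the taskflow designer validates jumps.

```python
# Illustrative validation of a Jump target. Each step is described by the
# Parallel Paths branch it belongs to (None if it is outside any Parallel
# Paths step) and whether it lies on a fault path. This data model is an
# assumption made for the example.

def jump_is_allowed(source: dict, target: dict) -> bool:
    # Source and target must be on the same Parallel Paths branch, or both
    # outside any Parallel Paths step.
    if source["branch"] != target["branch"]:
        return False
    # A Jump step cannot originate from a fault path.
    if source["fault_path"]:
        return False
    # A step outside a fault path cannot jump to a step inside one.
    if target["fault_path"]:
        return False
    return True

branch_1 = {"branch": "branch_1", "fault_path": False}
branch_2 = {"branch": "branch_2", "fault_path": False}
main_step = {"branch": None, "fault_path": False}
fault_step = {"branch": None, "fault_path": True}

assert not jump_is_allowed(branch_1, branch_2)    # different branches
assert not jump_is_allowed(fault_step, main_step) # jumping from a fault path
assert not jump_is_allowed(main_step, fault_step) # jumping into a fault path
assert jump_is_allowed(branch_1, branch_1)        # same branch is allowed
```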
Best design practices
Use the following best practices when you use the Jump step in a taskflow:
•Instead of using a Jump step to jump from an outside flow to a step inside the fault path, design the taskflow so that the Jump step is replaced with the same sequence of steps that the fault path contains.
•If you use a Jump step for iteration purposes and it points to a Subtaskflow step with fault handling, enclose the Subtaskflow step with fault handling inside another Subtaskflow step.
End step
An End step indicates the end of the taskflow. When execution reaches this step, the taskflow completes.
You can configure the following End step properties:
Name
The name of the step. You can edit this value. The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Ending Type
The default value is End of Process. You cannot edit this value.
HTTP Status
The HTTP response status code. The default value is 200 OK. You can edit this value.
Wait step
When you add a Wait step, you set some properties.
You can configure the following Wait step properties:
Name
The name of the Wait step. The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Wait
Properties that determine when and for how long the taskflow pauses.
Use the following criteria to decide if you want the taskflow to pause At a Specific Time or After a wait period:
- Select At a Specific Time to pause the taskflow at a particular time. Enter the Time at which you want the taskflow to pause and, optionally, a Delay. The Delay value can be an integer or a field that you define.
For example, set the taskflow to pause at 2:00 am after three days. 2:00 am is the Time and three days is the Delay.
- Select After a wait period to pause the taskflow after a period. The period begins when the taskflow reaches the Wait step. Enter the Wait Period that you want the taskflow to pause for. The Wait Period value can be an integer or a field that you define.
For example, set the taskflow to pause for one hour from the time that the taskflow reaches the Wait step.
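The following Python sketch restates the two options with the datetime module, reusing the 2:00 am and three-day example. Interpreting the Time and Delay values as a single target point in time is an assumption made for illustration; the helpers are not part of the product.

```python
from datetime import datetime, time, timedelta

# Illustrative helpers that restate the Wait step options. They are not part
# of the product; the 2:00 am and three-day values come from the example above.

def specific_time_target(now: datetime, at: time, delay_days: int) -> datetime:
    """At a Specific Time: the point in time defined by the Time and Delay values."""
    return datetime.combine((now + timedelta(days=delay_days)).date(), at)

def wait_period_target(now: datetime, wait_period: timedelta) -> datetime:
    """After a wait period: the period starts when the taskflow reaches the Wait step."""
    return now + wait_period

reached_wait_step = datetime(2024, 1, 1, 9, 30)
print(specific_time_target(reached_wait_step, time(2, 0), delay_days=3))  # 2024-01-04 02:00:00
print(wait_period_target(reached_wait_step, timedelta(hours=1)))          # 2024-01-01 10:30:00
```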
Throw step
Use a Throw step to catch a fault, return the fault details, stop the execution of the taskflow, and set the taskflow status to failed.
You can use a Throw step for the following use cases:
To catch faults in a taskflow
You can add a Throw step to the main path of a taskflow to catch faults in a taskflow and return the fault details. You cannot add a step after the Throw step because the Throw step is an interrupting step. If a fault occurs, the Throw step stops the execution of the taskflow and sets the taskflow status to failed.
For example, consider the following sample taskflow:
If a fault occurs in the Notification Task step, the Throw step is executed. The Throw step stops the execution of the taskflow and sets the taskflow status to failed.
To act as a boundary event for a specific step in a taskflow
When you enable custom error handling for a taskflow step, you can use a Throw step in the error handling path to act as a boundary event for the step. A boundary event is an event that catches an error that occurs within the scope of the step where it is defined.
You can add a Throw step to the error handling path of the following steps because they support custom error handling:
- Data Task
- Command Task
- File Watch Task
- Ingestion Task
You can also add a Throw step to the error handling path of a Subtaskflow step if you configure the Subtaskflow step to catch faults. If the taskflow that is contained within the Subtaskflow step fails, the parent taskflow also fails. When you view the execution details of the parent taskflow in Monitor, you can click the Throw step that is associated with the Subtaskflow step to understand why the subtaskflow failed.
When you add a Throw step to the error handling path, the error handling path breaks off from the main path of the taskflow. If a fault occurs, the taskflow takes the path that you define for handling the error. For example, in the error path, you might add a Notification Task step to send an email notification followed by a Throw step to catch the fault and stop the taskflow execution.
In the error handling path, you cannot add a step after the Throw step because the Throw step is an interrupting step. If a fault occurs, the Throw step stops the execution of the taskflow and sets the taskflow status to failed. The subsequent steps in the main path of the taskflow that exist after the step that is associated with the Throw step are not executed.
For example, consider the following sample taskflow:
The Subtaskflow step is configured to catch faults. If a fault occurs, an email notification is sent as configured in the Notification Task step. The Throw step then stops the execution of the taskflow, returns the fault details specifying why the subtaskflow failed, and sets the taskflow status to failed. The Data Task step is not executed.
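A minimal Python sketch of this boundary-event pattern, assuming a simplified model in which each step is a function: if the guarded step faults, the error handling path sends the notification and the Throw step ends the run with a failed status. None of these names correspond to product APIs.

```python
# Conceptual model of a Throw step on a custom error handling path.
# TaskflowFailed and the step functions are illustrative, not product APIs.

class TaskflowFailed(Exception):
    """Models the Throw step: execution stops and the status is set to failed."""

def run_with_boundary_event(guarded_step, notify, fault_code="SUBTASKFLOW_FAULT"):
    try:
        guarded_step()
    except Exception as fault:
        # Error handling path: send the notification, then throw. No step can
        # follow the Throw step, and the remaining main-path steps never run.
        notify(fault)
        raise TaskflowFailed(f"{fault_code}: {fault}") from fault
    return "continue with the main path"
```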
Throw step properties
The following sections describe the Throw step properties:
General properties
In the general properties, you can specify a descriptive name for the Throw step.
The name can contain only alphanumeric characters, underscores (_), spaces, and Unicode characters. The name can't contain curly brackets {}.
Fault fields
You can configure the following fault fields:
Code
Required. Defines the code for the fault.
Use one of the following options to specify the value for this field:
- Content. Enter a code for the fault.
- Field. Select the field that the Taskflow Designer uses to write the code into this field when this step executes. You can select an input field, temporary field, output field, or fault field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
Detail
Defines the fault details.
Use one of the following options to specify the value for this field:
- Content. Enter the fault details.
- Field. Select the field that the Taskflow Designer uses to write the fault details into this field when this step executes. You can select an input field, temporary field, output field, or fault field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
Default is Content.
Reason
Defines why the fault occurred.
Use one of the following options to specify the value for this field:
- Content. Enter the reason why the fault occurred.
- Field. Select the field that the Taskflow Designer uses to write the fault reason into this field when this step executes. You can select an input field, temporary field, output field, or fault field that was added in any other step in the taskflow.
- Formula. Open the formula editor to specify a formula that calculates the value for this field.
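Taken together, the Code, Detail, and Reason fields describe a small fault record. The following dataclass is an illustrative way to picture that record; it is not a product data type, and the example values are invented.

```python
from dataclasses import dataclass

# Illustrative shape of the fault information that a Throw step returns.
# This is not a product data type; the field names mirror the fault fields
# described above, and the example values are invented.

@dataclass
class Fault:
    code: str         # Required. The code for the fault.
    detail: str = ""  # The fault details.
    reason: str = ""  # Why the fault occurred.

fault = Fault(
    code="SUBTASKFLOW_FAILED",
    detail="The embedded taskflow returned a fault from its Data Task step.",
    reason="The mapping task could not connect to the source.",
)
```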