When you configure a synchronization task in Salesforce, you can use the following advanced source and target options:
Advanced Salesforce Option
Description
Allow Null Updates to Target
Salesforce targets only. Indicates if null values are allowed to replace existing values in the target. Select True to allow null values to replace existing values in the target.
Default is False.
Salesforce API
API used to process Salesforce source or target data. Select one of the following options:
- Standard API. Uses the Salesforce standard API to process Salesforce data.
- Bulk API. Uses the Salesforce Bulk API to process Salesforce data.
Target Batch Size
For loading to Salesforce targets using the standard API.
The maximum number of records to include in each query that writes to the Salesforce target. Enter a number between 1 and 200.
To process multiple upserts on a particular row in the same query, set the batch size to 1.
Default is 200.
Create the Success File
For loading to Salesforce targets using the standard API.
Creates a success file for a standard API task.
Assignment Rule Selection
For loading to Salesforce Case, Lead, or Account target objects using the standard API.
Assignment rule to reassign attributes in records when you insert, update, or upsert records:
- None. Select to use no assignment rule. Default is None.
- Default. Select to use the default assignment rule set for the organization.
- Custom. Applicable only to Case and Lead objects. Select to specify and use a custom assignment rule.
Monitor the Bulk Job
For loading to Salesforce targets using the Bulk API.
Monitors a Bulk API job to provide accurate session statistics for each batch. Generates a Bulk API error file with row-level details based on information provided by the Salesforce Bulk API.
Without monitoring, the All Jobs page and session log cannot provide information about batch processing or create success and error files.
Monitoring requires additional Bulk API calls.
Create the Success File
For loading to Salesforce targets using the Bulk API with monitoring enabled.
Creates a success file with row-level details based on information provided by the Salesforce Bulk API.
Enable Serial Mode
For loading to Salesforce targets using the Bulk API.
Salesforce loads Bulk API batches to the target serially. By default, Salesforce loads Bulk API batches in parallel.
Enable Hard Delete
For loading to Salesforce targets using the Bulk API with the Delete task operation.
Permanently deletes target rows. Deleted rows cannot be recovered.
Enable PK Chunking
Use to extract from Salesforce sources when you use the Bulk API.
Enables primary key chunking to optimize performance while extracting from large data sets.
Salesforce splits the data set into a number of chunks based on the record ID, creates multiple queries to extract the data, and combines the results.
Salesforce supports primary key chunking for custom objects and certain standard objects. For more information about supported objects for primary key chunking, see the Salesforce documentation.
PK Chunking Size
The number of records in a chunk.
Default is 100,000. The maximum value is 250,000.
Applicable only if you select Enable PK Chunking.
PK Chunking startRow ID
The record ID from which you want to chunk the data set.
By default, Salesforce applies chunking from the first record.
Applicable only if you select Enable PK Chunking.
PK Chunking Parent Object
Not applicable.
Salesforce standard API
You can use the Salesforce standard API to read data from Salesforce sources and write data to Salesforce targets. Use the standard API to process a typical amount of Salesforce data and to get standard reporting on the results of the load.
Target batch size for the standard API
When you use the Salesforce standard API to write to Salesforce targets, you can configure the target batch size used to write data to Salesforce.
The target batch size determines the maximum number of records to include in each query that writes to the Salesforce target. Salesforce allows up to 200 records for each query. If you enter a value higher than 200, each query includes only 200 rows. Default is 200.
You might use a smaller batch size for upserts because you cannot update the same row more than once in a single query. To process multiple upserts on a particular row in the same query, set the batch size to 1.
Salesforce limits the number of queries you can make in a 24-hour period.
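As a rough illustration of how the batch size shapes the write, the batching can be sketched in Python. The function and record list here are hypothetical; the actual batching is performed by the synchronization task, not by user code:

```python
def make_batches(records, batch_size=200):
    """Split records into standard API batches. Salesforce accepts at
    most 200 records per query, so larger values are capped at 200."""
    batch_size = min(max(batch_size, 1), 200)
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

records = [{"Id": n} for n in range(450)]
print([len(b) for b in make_batches(records)])   # [200, 200, 50]
# With batch size 1, each row gets its own query, so the same row can
# be upserted more than once in a run.
print(len(make_batches(records, batch_size=1)))  # 450
```

A batch size of 1 trades throughput for the ability to apply repeated upserts to the same row, which is why the smaller setting is suggested for upsert-heavy loads.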
Success and error files for the standard API
When you use the Salesforce standard API to write to Salesforce targets, the synchronization task creates a Salesforce error file by default. You can configure the task to also create a Salesforce success file. You can generate two Salesforce success files: one with a UTC timestamp and one with the Secure Agent local timestamp.
The Salesforce success file contains one row for each successfully processed row. Each row contains the row ID, data, and one of the following task operations: Created, Updated, Upserted, or Deleted. Use the success file to track rows that are created if you need to roll back the operation.
The Salesforce error file contains one row for each row that is not written to the target. The error file contains an entry for each data error. Each log entry contains the values for all fields of the record and the error message. Use this file to understand why records did not load into the Salesforce target.
The following table describes the location and naming convention for the standard API success and error files:
To generate a success file with Secure Agent local timestamp:
1. Navigate to the Schedule page of the Synchronization Task wizard.
2. In the Advanced Salesforce Options area, for the Salesforce API, select Standard API.
3. Select Create the Success File.
4. Save your changes.
To generate the additional success file with UTC timestamp:
1. Navigate to the Schedule page of the Synchronization Task wizard.
2. In the Advanced Salesforce Options area, for the Salesforce API, select Standard API.
3. Select Create the Success File.
4. Save your changes.
5. Click Administrator > Runtime Environments and select an agent.
6. Click Edit.
7. Under System Configuration Details, select DTM as the Type.
8. Set JVMOption1 to -DSFDCCreateSuccessErrorFileFromParams=true.
9. Click Save to save the changes.
Salesforce bulk API
You can use the Salesforce Bulk API to read data from Salesforce sources and write data to Salesforce targets. Use the Bulk API to process large amounts of Salesforce data while generating a minimal number of API calls.
When you use the Salesforce Bulk API, Salesforce restricts the size of each batch to 10 MB of data or 10,000 records in CSV format, whichever limit is reached first. For example, if you read 10 million records, the records are split into 1,000 batches of 10,000 records each. If 10,000 records exceed 10 MB, the batches are split by size instead of by record count.
When the synchronization task creates a batch, it adds any required characters to format the data, such as adding quotation marks around text.
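A minimal sketch of this splitting rule, assuming CSV-formatted rows. The function is illustrative only; the task performs its own batching internally:

```python
import csv
import io

MAX_RECORDS = 10_000          # Bulk API record limit per batch
MAX_BYTES = 10 * 1024 * 1024  # Bulk API 10 MB limit per batch

def split_into_batches(rows):
    """Group rows into batches that stay under both Bulk API limits,
    measuring each row as CSV so formatting characters (quotation
    marks, line endings) count toward the 10 MB limit."""
    batches, current, size = [], [], 0
    for row in rows:
        buf = io.StringIO()
        csv.writer(buf).writerow(row)
        row_bytes = len(buf.getvalue().encode("utf-8"))
        if current and (len(current) >= MAX_RECORDS or size + row_bytes > MAX_BYTES):
            batches.append(current)
            current, size = [], 0
        current.append(row)
        size += row_bytes
    if current:
        batches.append(current)
    return batches

# 25,000 small rows split on the record limit: 10,000 + 10,000 + 5,000.
lengths = [len(b) for b in split_into_batches([["a", i] for i in range(25_000)])]
print(lengths)  # [10000, 10000, 5000]
```

With wide rows the byte limit triggers first, so a batch can close well before it reaches 10,000 records.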
You can monitor jobs that use the Bulk API to write to Salesforce targets. When you monitor a Bulk API target job, the synchronization task can create success and error files for row-level information. The synchronization task can also load batches at the same time or serially.
Monitor the bulk job
When you use the Bulk API to write to Salesforce targets, you can enable the task for monitoring. With monitoring enabled, the synchronization task requests the status of each batch from the Salesforce service and repeats the request every 10 seconds until all batches complete. The synchronization task writes the responses from the Salesforce service to the All Jobs page and the session log. With monitoring enabled, the synchronization task also generates a Bulk API error file.
By default, the synchronization task monitors Bulk API jobs. You can configure the task to run without monitoring. Without monitoring, the All Jobs page and the session log contain information about batch creation, but do not contain details about batch processing or accurate job statistics.
Note: The synchronization task performs additional API calls when it monitors a Bulk API job. To reduce the number of API calls the synchronization task makes, do not monitor the job. To get information about batch processing without monitoring, use the job and batch IDs from the session log to access statistics in Salesforce.
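The polling behavior described above can be sketched as follows. Here get_batch_states is a stand-in for the Salesforce batch-status call, and the terminal state names follow the Bulk API's batch states:

```python
import time

# Bulk API batch states that will not change further.
TERMINAL_STATES = {"Completed", "Failed", "Not Processed"}

def monitor_batches(get_batch_states, poll_interval=10, sleep=time.sleep):
    """Request batch states until every batch reaches a terminal
    state, pausing poll_interval seconds between requests."""
    while True:
        states = get_batch_states()
        if all(state in TERMINAL_STATES for state in states):
            return states
        sleep(poll_interval)

# Simulated service: the two batches finish on the third poll.
responses = iter([
    ["InProgress", "Queued"],
    ["Completed", "InProgress"],
    ["Completed", "Completed"],
])
final = monitor_batches(lambda: next(responses), sleep=lambda seconds: None)
print(final)  # ['Completed', 'Completed']
```

Each poll is an extra API call, which is the cost the note above warns about when monitoring is enabled.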
Success and error files for the bulk API
When you monitor a Bulk API target job, the synchronization task generates a Bulk API error file by default. You can configure the task to create a Bulk API success file. Success and error files are CSV files that contain row-level details provided by the Salesforce service.
The Bulk API success and error files include the job ID, batch ID, Id, success, created, and error message information.
The following table describes the location and naming convention for the Bulk API success and error files:
To generate the Bulk API success file:
1. Navigate to the Schedule page of the Synchronization Task wizard.
2. In the Advanced Salesforce Options area, for the Salesforce API, select Bulk API.
3. Select Monitor the Bulk Job.
4. Select Create the Success File.
5. Save your changes.
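Because the files are CSV, the row-level details can be read back with any CSV reader. This sketch assumes header names derived from the fields listed above; the headers and values in a generated file may differ:

```python
import csv
import io

# Hypothetical sample of a Bulk API result file.
sample = """jobId,batchId,Id,success,created,error
750x0,751x0,001A000001,true,false,
750x0,751x0,,false,false,REQUIRED_FIELD_MISSING:Name
"""

rows = list(csv.DictReader(io.StringIO(sample)))
# Rows that did not reach the target carry an error message.
failed = [row for row in rows if row["success"] != "true"]
print(len(rows), len(failed))  # 2 1
print(failed[0]["error"])      # REQUIRED_FIELD_MISSING:Name
```

Filtering on the success column is a simple way to isolate the rows that need reprocessing after a load.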
Enable serial mode
When you use the Bulk API to load data to Salesforce, you can configure the task to perform a parallel load or a serial load. By default, the task performs a parallel load.
In a parallel load, Salesforce writes batches to targets at the same time. Salesforce processes each batch whenever possible. In a serial load, Salesforce writes batches to targets in the order it receives them. Salesforce processes the entire content of each batch before processing the next batch.
Use a parallel load to increase performance when you do not need a specific target load order. Use a serial load when you want to preserve the target load order, such as during an upsert load.
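In Bulk API terms, serial mode corresponds to the job's concurrencyMode setting. The payload below is an illustrative sketch of a job-creation request, not the task's actual request:

```python
def job_request(object_name, operation, serial=False):
    """Build an illustrative Bulk API job-creation payload; serial
    mode sets concurrencyMode to Serial instead of Parallel."""
    return {
        "object": object_name,
        "operation": operation,
        "contentType": "CSV",
        "concurrencyMode": "Serial" if serial else "Parallel",
    }

print(job_request("Account", "upsert", serial=True)["concurrencyMode"])  # Serial
print(job_request("Account", "insert")["concurrencyMode"])               # Parallel
```

Because concurrencyMode is set when the job is created, the choice between serial and parallel applies to every batch in that job.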
Hard deletes
When you use the Bulk API, you can configure the task to permanently delete data from Salesforce targets.
When you use a Bulk API task to delete data from Salesforce targets, the deleted rows are moved to the recycle bin. You can retrieve the deleted data within a certain amount of time, but the deleted rows consume additional storage space until they are purged.
With a hard delete, the synchronization task bypasses the recycle bin. You cannot recover data if you delete the data with the hard delete option.
Enable primary key chunking
Enable primary key chunking to increase performance when you extract data from large tables.
When you use the Bulk API to extract data from Salesforce, you can enable primary key chunking. By default, the Bulk API does not use primary key chunking.
When you enable primary key chunking, the Bulk API splits the data set into multiple chunks based on the record ID and creates extract queries for each chunk. The Bulk API combines the data when all the extract queries are complete.
Salesforce supports primary key chunking for custom objects and certain standard objects. For more information about objects that support primary key chunking, see the Salesforce documentation.
Note: You can enable primary key chunking when your Salesforce connection uses version 32 or later of the Salesforce API. The default chunk size is 100,000 records.
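At the API level, primary key chunking is requested through the Sforce-Enable-PKChunking request header. The sketch below builds that header under the limits described above; the example record ID is made up:

```python
def pk_chunking_header(chunk_size=100_000, start_row=None):
    """Build the Sforce-Enable-PKChunking header value. chunk_size is
    the number of records per chunk (maximum 250,000) and start_row is
    the record ID at which chunking starts."""
    if not 1 <= chunk_size <= 250_000:
        raise ValueError("chunk size must be between 1 and 250,000")
    value = f"chunkSize={chunk_size}"
    if start_row is not None:
        value += f"; startRow={start_row}"
    return {"Sforce-Enable-PKChunking": value}

print(pk_chunking_header())
# {'Sforce-Enable-PKChunking': 'chunkSize=100000'}
print(pk_chunking_header(chunk_size=250_000, start_row="001300000000000"))
```

Omitting start_row lets Salesforce chunk from the first record, matching the default behavior of the PK Chunking startRow ID option.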