The October 2025 release of Data Integration includes the following new features and enhancements.
Watch the What's New video to learn about new features and enhancements in the October 2025 release.
Data transfer tasks
As of February 2026, we've rolled out the following updates to data transfer tasks as part of the October 2025 release:
•When you configure a source or target in a data transfer task, you can edit the field data type, precision, and scale. You can also refresh fields to get the latest metadata from the object.
•When you configure advanced attributes for a source, lookup source, or target connection, you can provide a parameter in place of the attribute and specify the value to use at run time in a parameter file.
For more information about data transfer tasks, see Tasks.
For more information about parameters and parameter files, see Mappings.
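As an illustration of run-time parameterization, a parameter file might look like the following sketch. The section and parameter names here are hypothetical; the exact format is described in the parameter files documentation.

```
#USE_SECTIONS
[Global]
$$SrcAdvancedAttr=10000
[MyProject].[MyFolder].[my_data_transfer_task]
$$TgtAdvancedAttr=true
```

At run time, the task replaces each parameter you entered in place of an advanced attribute with the value defined in the file.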
Synchronization task conversion
As of February 2026, we've rolled out the following enhancements to synchronization task conversion as part of the October 2025 release.
When you convert a synchronization task that contains edited field metadata to a data transfer task, Data Integration copies the metadata changes to the new data transfer task. Data Integration also tags the new data transfer task with the tag “FIELD METADATA EDITED”.
For more information, see Tasks.
Public REST APIs for mapplets
As of November 2025, we've rolled out the following new feature as part of the October 2025 release:
You can expose a mapplet that includes the following data quality transformations as an API and design a custom endpoint for the API:
•Cleanse
•Labeler
•Parse
•Rule Specification
The mapplet that you expose can additionally include a Filter transformation and a range of expressions that you can specify in an Expression transformation or in a Rule Specification transformation.
You can now create up to 100 arguments in user-defined functions. Previously, you could create up to 10 arguments.
For more information, see Components.
Convert synchronization tasks to data transfer tasks
You can use the convert resource to convert your existing synchronization tasks to data transfer tasks.
Use a GET request with the following URI to find out if a synchronization task can be converted to a data transfer task:
<serverURL>/dtt/dss/convert/test/{assetId}
Use a POST request with the following URI to convert the synchronization task to a data transfer task:
<serverURL>/dtt/dss/convert/{assetId}
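The two calls above can be sketched as follows. This is a minimal sketch that only builds the request URIs; the server URL is a placeholder, and authentication headers (not shown) are required as described in the REST API help.

```python
# Build the convert-test (GET) and convert (POST) URIs for a
# synchronization task. <serverURL> is your pod's base URL.

def convert_test_uri(server_url: str, asset_id: str) -> str:
    """URI to GET to check whether the task can be converted."""
    return f"{server_url}/dtt/dss/convert/test/{asset_id}"

def convert_uri(server_url: str, asset_id: str) -> str:
    """URI to POST to convert the task to a data transfer task."""
    return f"{server_url}/dtt/dss/convert/{asset_id}"

# Hypothetical server URL and asset ID for illustration only.
server = "https://example.informaticacloud.com/saas"
print(convert_test_uri(server, "012ABC"))
print(convert_uri(server, "012ABC"))
```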
For more information, see the REST API help.
Data transfer tasks
You can augment source data with data from up to five lookup sources. When you configure a lookup source, you can select the fields to return, rename the fields, and specify how the task handles multiple return values.
You can also configure the following runtime properties for the task:
•Pre- and post-processing commands
•Parameter file name
•Maximum number of log files to retain
•Execution mode
For more information, see Tasks.
Data preview jobs
When you run a data preview job in a mapping with one or more associated mapping tasks, you can select a mapping task to use as the basis for the preview. When you select a mapping task, the preview job runs with design time parameters, parameter file values, and advanced session properties configured in the task.
For more information, see Mappings.
Data Vault mapping and mapping task templates
You can use preconfigured mapping and mapping task templates to build Data Vault hub, link, and satellite tables.
Intelligent structure models
You can parse image-based PDF files when you create an intelligent structure model using the custom AI engine.
For more information, see Components.
Mapping task details
The Task Details page for a mapping task contains the following new features:
•The Transformations with Connections section displays general and advanced properties for transformations that use a connection.
•You can view, copy, and download the SQL queries used in the task.
The Task Details page is also updated with a new look and feel.
For more information, see Tasks.
Override source partitions using a parameter file
When you configure partitions for a parameterized flat file or relational source, you can override the partitions using a parameter file.
For more information, see Mappings.
Special characters in field names
When a mapping preserves special characters in source and target field names, field names can now include the following special characters:
. + - = ~ ` ! $ % ^ & * ( ) [ ] { } ' \ / ? < > |
For more information, see Mappings.
Taskflows
Taskflows include the following new features and enhancements:
• In a Data Task step that uses a mapping task, you can override the default data type, scale, and precision of the source fields in FTP and SFTP connections. Use the new Source_defaultNativeDataType, Source_defaultScale, and Source_defaultPrecision fields in the Source list to override the default data type, scale, and precision of the source fields.
• If you are assigned the Administrator, Deployer, Designer, or Operator role, you can use the update resource to update the values of the API Name, Allowed Users, and Allowed Groups fields. Use the request body to pass the values. You can apply the updates to one or more taskflows in a single update request.
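A request body for the update resource might be shaped like the following sketch. The property names and taskflow ID are assumptions for illustration; check the REST API help for the actual schema.

```python
import json

# Hypothetical update-resource body that sets the API Name, Allowed
# Users, and Allowed Groups fields for one taskflow.
payload = {
    "taskflows": [
        {
            "id": "aBcD1234",             # hypothetical taskflow ID
            "apiName": "orders_nightly",  # new API Name
            "allowedUsers": ["jsmith"],   # users allowed to run the taskflow
            "allowedGroups": ["ops"],     # groups allowed to run the taskflow
        }
    ]
}
body = json.dumps(payload)
```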
• You can use the optional subtaskDetails parameter in the taskflow status API URL to control whether the response includes the statuses of the tasks and steps within the taskflow. The parameter accepts a value of Yes or No. Set subtaskDetails to No to return only the taskflow status.
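Appending the parameter to a status request can be sketched as follows. The base URL and status path are placeholders; only the subtaskDetails query parameter comes from this release note.

```python
from urllib.parse import urlencode

# Append the optional subtaskDetails parameter to a taskflow status
# request. base and run_id are hypothetical; see the REST API help
# for the actual status URL for your pod.
def status_url(base: str, run_id: str, subtask_details: str = "No") -> str:
    return f"{base}/tf/status/{run_id}?" + urlencode(
        {"subtaskDetails": subtask_details}
    )
```

With subtask_details="No", the response contains only the taskflow status, without per-task or per-step statuses.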