Before you migrate assets, consider the following rules for certain configurations.
Advanced filter and table name override combination
If the mapping contains an advanced filter for the object, for example, <Table2>, in the query options section, and a table name override in the advanced properties, consider the following rules to override the table:
In mappings, consider one of the following options:
•The advanced filter in the mapping must include the $$Filtercondition value. In the mapping task, you can override the value from the parameter file in the task properties. The parameter file must include the following condition: $$Filtercondition=<Table2>.id >= 1
•Use the table name that you specify as an override in the advanced properties directly in the advanced filter condition. For example, <Table2>.id >= 1
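For example, a parameter file entry for the filter condition might look like the following minimal sketch. The [Global] section applies the value to every task that reads the file; use a task-scoped section instead if your parameter file is organized that way:

```
[Global]
$$Filtercondition=<Table2>.id >= 1
```

When the mapping task runs, the $$Filtercondition value from the parameter file replaces the value configured in the mapping.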
When you create mappings in advanced mode, use the table name specified as an override in the advanced properties directly in the advanced filter condition. For example, <Table2>.id >= 1
Parameterization
Consider the following rules to parameterize the object and advanced properties:
•Parameterization is not applicable when you create a new target at runtime and select the Allow parameter to be overridden at runtime option in a mapping.
•Parameterization is not applicable for mappings in advanced mode in the migration use case.
•If a mapping contains multiple pipelines, do not parameterize the Databricks object. You must select a placeholder object when you create the mapping before you migrate.
Schema change handling
Schema change handling is not applicable. You cannot dynamically refresh the data object schema at runtime. You must maintain the same metadata for the table selected in the source, target, or lookup transformations and in the corresponding advanced field overrides.
SQL override and custom query combination
In the mapping, when you specify an SQL override and the object type used is a custom query, ensure that the tables that you specify in the custom query and in the SQL override contain the same metadata. Both the custom query and the SQL override must be fully qualified. For example, Select * from Catalog1.DB1.Table1
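For example, the custom query and the SQL override might both reference the same fully qualified table and return the same columns so that the metadata matches. The table and column names in this sketch are placeholders:

```
Custom query: SELECT id, name FROM Catalog1.DB1.Table1
SQL override: SELECT id, name FROM Catalog1.DB1.Table1 WHERE id >= 1
```

Because both statements project the same columns from the same fully qualified table, the metadata of the custom query and the SQL override stays consistent.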
When the mapping contains an SQL override and custom query combination, do not specify the database or table name in both the advanced source properties in the mapping and in the Databricks connection.
When you configure a partially parameterized custom query, specify default values for the parameters. For example, when you specify Select * from $$Catalog1.$$DB1.$$Table1, add a placeholder value as the default value when you create the in-out parameters for Catalog1, DB1, and Table1.
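For example, the in-out parameters and their defaults might resolve as in the following sketch. The default values shown are placeholders that you replace with your own catalog, database, and table names:

```
Custom query:   Select * from $$Catalog1.$$DB1.$$Table1
In-out parameter defaults:
  $$Catalog1 = Catalog1
  $$DB1      = DB1
  $$Table1   = Table1
Resolved query: Select * from Catalog1.DB1.Table1
```

At run time, each $$ parameter is replaced with its default value unless a parameter file or task property overrides it.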
If you select parameter as the source type in the mapping, do not select the Allow parameter to be overridden at runtime option. You can then select the custom query in the mapping task.
Table name override
In the mapping, if you select the custom query object type and specify a database override or table name override, Data Integration ignores the override values and does not override the database or table.
Update override
When you write data to Databricks and specify an update override for the target object, ensure that the target object specified in the override exists in the mapping after you migrate the mapping.