Consider the following rules and guidelines for Microsoft Fabric Data Warehouse mappings:
General guidelines
- When you configure a filter in a mapping that reads data, consider the following guidelines:
▪ If the table or schema name contains Unicode characters, the mapping fails. To resolve this issue, enclose the table name or schema name with Unicode characters in square brackets within the filter condition.
▪ If the column name starts with a number, the mapping fails.
▪ If you prefix the column name with an underscore in the filter condition, the mapping fails. You need to enter the actual column name in the filter condition.
▪ If the mapping contains an advanced filter for the object, for example, <Table2>, in the query options section and a table name override in the advanced properties, use the override table name directly in the advanced filter condition. For example, <Table2>.id >= 1
- If the column or schema name in a custom query, an SQL override, or Update override field contains Unicode characters, enclose the column name or schema name with Unicode characters in square brackets.
- Ensure that the queries you specify for the pre-SQL and post-SQL statements are valid. Otherwise, the mapping fails.
- When you read from Microsoft Azure Synapse SQL and write to Microsoft Fabric Data Warehouse, the mapping fails.
- When you specify a custom query for a source object, ensure that the query does not contain a semicolon at the end of the SQL statement. Otherwise, the mapping fails.
- You cannot use input parameters in custom queries that read data from Microsoft Fabric Data Warehouse source objects.
- When you specify a single constant with the IIF condition in the data driven condition, such as IIF(col_bigint>747654,DD_INSERT), the Secure Agent inserts the data into the target even for rows that do not satisfy the condition.
- When you specify the DD_REJECT constant in the data driven condition, the Secure Agent does not log the rejected rows in the error file or the session log.
- When you configure a parameterized mapping with the Create New at Runtime option, enter the schema name/table name in the Object field in the Target transformation. The mapping validates the entry and throws a validation error if a table with the same name exists. However, if you enter only the table name and a table with the same name exists, the mapping does not validate the entry and inserts data into the existing table.
- A mapping that uses a MINUS operator in a custom query to read data from Microsoft Fabric Data Warehouse source objects fails at runtime. Use an EXCEPT operator instead of a MINUS operator.
- When you use a custom query in the Source transformation to call a stored procedure, the mapping fails at runtime.
- When you configure key range partitioning in the Source transformation in a mapping that reads data which contains null values, the mapping ignores the null values and does not write them to the target.
- To avoid a Java heap size error when you write to Microsoft Fabric Data Warehouse, allocate more memory to the DTM in the Secure Agent properties based on the amount of data that you want to write. Increase the -Xms and -Xmx values for the DTM in the JVM options in the system configuration details of the Secure Agent. The recommended -Xms value is 512 MB and the recommended -Xmx value is 1024 MB.
- When you write to Microsoft Fabric Data Warehouse, the mapping fails if the table name or schema name contains unsupported special characters. However, the table name or schema name can contain the following special characters:
▪ !, @, #, $, %, ^, &, _, |, :, ", ?, >, <
- When you filter records in a read operation, you can use only the following filter operators:
▪ Equals
▪ Greater
▪ Greater_or_equals
▪ Less_or_equals
▪ Less
▪ Not_Equals
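As an illustration of the filter and custom query guidelines above, consider the following hypothetical fragments. All table and column names are assumed, not taken from the product documentation:

```sql
-- Hypothetical filter condition: the table name contains Unicode
-- characters, so it is enclosed in square brackets, and the operator
-- (Greater_or_equals) is one of the supported filter operators.
[Tabelle_Ü].order_id >= 100

-- Hypothetical custom query for a source object: there is no trailing
-- semicolon, and EXCEPT is used in place of the unsupported MINUS operator.
SELECT order_id FROM current_orders
EXCEPT
SELECT order_id FROM archived_orders
```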
Lookup Transformation
- You cannot set the default column value for the return field in a cached Lookup transformation in a Microsoft Fabric Data Warehouse mapping in advanced mode.
- You cannot use columns of the Binary and Varbinary data types as lookup conditions.
- In a mapping in advanced mode that reads from and looks up data from Microsoft Fabric Data Warehouse, the target cannot be Microsoft Fabric Data Warehouse. If the target is Microsoft Fabric Data Warehouse, the mapping fails.
- When you configure a lookup data filter in a cached lookup, you cannot use a custom query as a lookup object or configure an SQL override. However, the table name override is considered when you configure a lookup data filter.
- You cannot use a custom query as a lookup object or configure an SQL override in an uncached lookup.
- If you parameterize the lookup condition, you can only use the AND operator between multiple conditions or expressions.
SQL override
- When you specify an SQL override statement to override the default SQL query that reads data from a Microsoft Fabric Data Warehouse source, ensure that the SQL query does not contain a semicolon at the end of the SQL statement. Otherwise, the mapping in advanced mode fails.
- When you use an ORDER BY clause along with Join or Union clauses in a custom query or in an SQL override statement to read data from Microsoft Fabric Data Warehouse source objects, ensure that you add "offset 0 rows" at the end of the query. Otherwise, the mapping fails.
- If the mapping contains both a filter for the source object and an SQL override statement in the advanced source properties, the query in the SQL override statement takes precedence over the filter condition.
- When you specify an SQL override statement in the Source transformation to call a stored procedure in Microsoft Fabric Data Warehouse, the mapping in advanced mode fails at runtime.
- The count and order of columns in an SQL override must match the count and order of columns in the connected fields.
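To illustrate the ORDER BY rule above, the following is a hypothetical SQL override that combines UNION with ORDER BY. The table and column names are assumed; note the "offset 0 rows" clause at the end and the absence of a trailing semicolon:

```sql
-- Hypothetical SQL override: ORDER BY combined with UNION requires
-- "offset 0 rows" at the end of the query, and the statement must not
-- end with a semicolon.
SELECT id, name FROM sales.customers
UNION
SELECT id, name FROM sales.prospects
ORDER BY id
OFFSET 0 ROWS
```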
Update override
- Specify the update query in the following format:
UPDATE <Target table name>
SET <Target table name>.<Column1> = :TU.<Column1>,
    <Target table name>.<Column2> = :TU.<Column2>,
    … <Target table name>.<ColumnN> = :TU.<ColumnN>
FROM :TU
WHERE <Target table name>.<Update Column1> = :TU.<Update Column1>
    AND <Target table name>.<Update Column2> = :TU.<Update Column2>
    AND … <Target table name>.<Update ColumnN> = :TU.<Update ColumnN>
- Column names for :TU must match the target table column names.
- You must specify the update query with a valid SQL syntax because Microsoft Fabric Data Warehouse Connector replaces :TU with a temporary table name and does not validate the update query.
- You cannot use fields that are not connected to the target in the update query.
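As an illustration, the update query format above applied to a hypothetical target table named emp with columns name and city, matched on emp_id (all names assumed, not from the product documentation):

```sql
-- Hypothetical update override. The connector replaces :TU with a
-- temporary table name at runtime and does not validate this query,
-- so the SQL syntax must be valid as written.
UPDATE emp
SET emp.name = :TU.name,
    emp.city = :TU.city
FROM :TU
WHERE emp.emp_id = :TU.emp_id
```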
Data types
- When you write data to a Microsoft Fabric Data Warehouse target created at runtime, Float or Real data types are written as Double data type in the target. To write Float or Real data types, use the Edit Metadata option to edit the data type in the Target transformation.
- When you read data of the Float or Real data type from a source and write to a target, the mapping might not accurately write the Float or Real data type to the target.
For more information about writing the Float or Real data type to Microsoft Fabric Data Warehouse, see Float and Real (Transact-SQL) in the Microsoft documentation.
Parameterization
- When you use input parameters, specify the parameter name in the following format in a mapping task: $name$.
- You can't parameterize a target created at runtime using the parameter file.
- You can't use input parameters in a parameter file to parameterize the source filter and advanced properties at runtime.
- Ensure that you create the parameter before you override it using a parameter file.
- You can't parameterize only the connection. You can either parameterize both the connection and the object or only the object.
- When you parameterize the object, ensure that the metadata of the imported object matches the metadata fetched in the override.
- When you parameterize the connection or object, ensure that the connection or object that you want to override exists in Microsoft Fabric Data Warehouse.
- When you parameterize the connection and objects using the parameter file, ensure that the object name in the parameter file does not contain the schema name. If the connection in the parameter file doesn't contain the schema name, the mapping considers the schema name configured in the mapping task. To override the schema name, use the schema override property in the advanced properties.
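The following fragment is only a sketch of a parameter file that overrides a previously created parameter using the $name$ format described above. The section header, parameter name, and value are assumptions for illustration; the exact layout of parameter files in your environment may differ:

```
#USE_SECTIONS
[Global]
$SrcFilter$=col_bigint >= 100
```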