Consider the following rules and guidelines for Snowflake objects used as sources, targets, and lookups in mappings:
•You can read or write data of the Binary data type, which is in hexadecimal format.
•You cannot write semi-structured data, such as XML, JSON, AVRO, or PARQUET data, to the target.
•You cannot specify more than one Pre-SQL or Post-SQL query in the source or target transformation.
•The agent reads or writes the maximum float value, which is 1.7976931348623158e+308, as infinity.
•If a Snowflake lookup object contains String fields of maximum or default precision and the row size exceeds the maximum row size, the task fails.
•You can use the following formats to specify filter values of the Datetime data type, as shown in the first example after this list:
- YYYY-MM-DD HH24:MI:SS
- YYYY/MM/DD HH24:MI:SS
- MM/DD/YYYY HH24:MI:SS
•When you provide a warehouse name in both the connection properties and the mapping properties, the warehouse name in the mapping overrides the warehouse name in the connection. The connection succeeds even if you provide an incorrect warehouse name in the connection properties. However, before you run the mapping, ensure that you specify the correct warehouse name in the mapping properties.
•When you use a SQL override query to override the custom query used for importing the metadata from Snowflake tables, you must specify a fully qualified table name.
•You can read from or write to Snowflake tables whose table or field names contain uppercase, lowercase, or mixed-case letters, numbers, and special characters. You cannot write data when the Snowflake table contains field names with the # and @ characters. If the Secure Agent is installed on Windows, you cannot write data to the Snowflake target table when the table name contains the following special characters: /\:*?"<>|
•When you use the Create Target option to create a table in Snowflake, special characters in the column name are replaced by the _ character.
•If the lookup condition fetches multiple matches and the multiplicity option is set to Report error, the task does not fail.
•When you add an advanced filter condition for the source object and the condition contains the "/" separator to separate the database, schema, and table names, the mapping fails with an SQL compilation error. Change the forward slash "/" separator to a dot "." in the SQL query and run the mapping.
•When you define a relationship for multiple source objects in a Source transformation and the condition uses the "/" separator to separate the database, schema, and table names, the mapping fails with an SQL compilation error. Update the forward slash "/" separator to a dot "." in the SQL query and run the mapping, as shown in the second example after this list.
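The following sketch shows an advanced filter that uses the first Datetime format. The database, schema, table, and column names are hypothetical:
    -- Filter value in YYYY-MM-DD HH24:MI:SS format
    SELECT *
    FROM MYDB.MYSCHEMA.ORDERS
    WHERE ORDER_DATE >= '2023-06-15 14:30:00';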
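The following sketch shows how to correct the separator in a failing query. The object names are hypothetical:
    -- Fails with an SQL compilation error because of the "/" separator
    SELECT * FROM MYDB/MYSCHEMA/MYTABLE;
    -- Corrected query with the "." separator in the fully qualified name
    SELECT * FROM MYDB.MYSCHEMA.MYTABLE;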
Rules and Guidelines for Query Source Type
When you configure a SQL query, consider the following points:
•Ensure that the query that you use to read from Snowflake specifies a fully qualified table name, as shown in the example after this list.
•Snowflake Connector does not support the following features when you use the Query source type option:
- Filter and sort options.
- Source partitioning.
- Advanced properties, except for pre-SQL and post-SQL statements.
•When you configure a pre-SQL or post-SQL query, the query must include a fully qualified table name.
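The following sketch shows a query and pre-SQL and post-SQL statements that each specify a fully qualified table name. The database, schema, and table names are hypothetical:
    -- Pre-SQL: clear the staging table before the read
    DELETE FROM MYDB.MYSCHEMA.ORDERS_STAGE;
    -- Query: read with a fully qualified table name
    SELECT ORDER_ID, ORDER_DATE, AMOUNT FROM MYDB.MYSCHEMA.ORDERS_STAGE;
    -- Post-SQL: record the run in an audit table
    INSERT INTO MYDB.MYSCHEMA.AUDIT_LOG (RUN_TIME) VALUES (CURRENT_TIMESTAMP);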
Snowflake Targets that Receive Duplicate Values for Primary Keys from Source Records
When you perform an update or delete operation in Snowflake and the records from the source tables contain duplicate primary keys, perform one of the following tasks:
•Before you import the target table, define multiple primary keys in the target table, as shown in the example after this list.
•Define more than one custom key for the target object using the Update Columns option in the advanced target properties.
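For example, the following DDL defines a composite primary key on a hypothetical target table before you import it. Snowflake records but does not enforce primary key constraints, so the key definition serves as metadata for matching rows:
    CREATE TABLE MYDB.MYSCHEMA.ORDERS_TGT (
        ORDER_ID NUMBER NOT NULL,
        LINE_ID  NUMBER NOT NULL,
        AMOUNT   NUMBER(18,2),
        -- Composite key: rows match on both ORDER_ID and LINE_ID
        PRIMARY KEY (ORDER_ID, LINE_ID)
    );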
Snowflake on Azure
You cannot use a user-defined Azure Blob staging location to create temporary tables.