Mapping configured to read or write Date and Int96 data types for Parquet files fails
A mapping configured to read from or write to Microsoft Azure Synapse SQL, where you select Parquet as the staging format in the advanced source properties, fails in the following cases:
- Data is of the Date data type and the date is earlier than 1582-10-15.
- Data is of the Int96 data type and the timestamp is earlier than 1900-01-01T00:00:00Z.
To resolve this issue, specify the following Spark session properties in the mapping task or in the custom properties file for the Secure Agent:
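For example, if the mapping runs on Spark 3.1 or later, the rebase behavior for dates earlier than 1582-10-15 and Int96 timestamps earlier than 1900-01-01 is governed by the standard Spark SQL rebase properties shown below. Treat these key names and the CORRECTED value as an assumed example, not the exact list for your agent version; on Spark 3.0 the keys carry a spark.sql.legacy. prefix instead, so verify the keys against the Spark version that the Secure Agent uses:

    spark.sql.parquet.datetimeRebaseModeInRead=CORRECTED
    spark.sql.parquet.datetimeRebaseModeInWrite=CORRECTED
    spark.sql.parquet.int96RebaseModeInRead=CORRECTED
    spark.sql.parquet.int96RebaseModeInWrite=CORRECTED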
Time zone for the Date and Timestamp data type fields in Parquet file format defaults to the Secure Agent host machine time zone
When you run a mapping to read from or write to Microsoft Azure Synapse SQL and select Parquet as the staging format in the advanced source or target properties, the time zone for the Date and Timestamp fields defaults to the Secure Agent host machine time zone.
To change the time zone for the Date and Timestamp fields to UTC, you can either set the Spark properties globally in the Secure Agent directory for all tasks in the organization that use this Secure Agent, or set the Spark session properties for a specific task in the task properties:
To set the properties globally, perform the following steps:
1. Add the following properties to the <Secure Agent installation directory>/apps/At_Scale_Server/41.0.2.1/spark/custom.properties file:
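As an assumed example, the following entries force UTC for the Spark SQL session and for the driver and executor JVMs; spark.sql.session.timeZone, spark.driver.extraJavaOptions, and spark.executor.extraJavaOptions are standard Spark properties, but the exact keys or prefix that custom.properties expects can vary by agent version, so verify them before you apply this sketch:

    spark.sql.session.timeZone=UTC
    spark.driver.extraJavaOptions=-Duser.timezone=UTC
    spark.executor.extraJavaOptions=-Duser.timezone=UTC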