Databricks and transformation data types

The following table compares the Databricks native data type to the transformation data type:
| Databricks Data Type | Transformation Data Type | Range and Description |
| --- | --- | --- |
| Array¹ | Array | Unlimited number of characters. |
| Binary | Binary | 1 to 104,857,600 bytes. |
| Bigint | Bigint | -9,223,372,036,854,775,808 to +9,223,372,036,854,775,807. 8-byte signed integer. |
| Boolean | Integer | 1 or 0. |
| Date | Date/Time | Date and time values. |
| Decimal | Decimal | Exact numeric of selectable precision. For mappings, the maximum precision is 28 and the maximum scale is 27. For mappings in advanced mode, the maximum precision is 38 and the maximum scale is 37. |
| Double | Double | Precision of 15. |
| Float | Double | Precision of 7. |
| Int | Integer | -2,147,483,648 to +2,147,483,647. |
| Map¹ | Map | Unlimited number of characters. |
| Smallint | Integer | -32,768 to +32,767. |
| String | String | 1 to 104,857,600 characters. |
| Struct¹ | Struct | Unlimited number of characters. |
| Tinyint | Integer | -128 to +127. |
| Timestamp | Date/Time | January 1, 0001 00:00:00 to December 31, 9999 23:59:59.997443. Timestamp values preserve precision only up to six digits (microseconds); any precision beyond six digits is discarded. |
| Varchar | String | 1 to 104,857,600 characters. To use the precision of the Varchar data type set in Databricks instead of the precision of the String transformation data type, set the -Ddatabricks.honorNativePrecision=true property for the Secure Agent. For more information about configuring the property, see the Varchar data type Knowledge Base article. |

¹ Applies only to mappings in advanced mode. Doesn't apply to job cluster, all-purpose cluster, and SQL ELT optimization.
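The mapping in the table above can be expressed as a simple lookup. This is an illustrative sketch only: the dictionary and function names are hypothetical and not part of the connector itself.

```python
# Databricks-to-transformation data type mapping, taken from the table above.
# Names and structure here are illustrative, not connector internals.
TYPE_MAP = {
    "array": "Array",        # advanced mode only
    "binary": "Binary",
    "bigint": "Bigint",
    "boolean": "Integer",    # maps to 1 or 0
    "date": "Date/Time",
    "decimal": "Decimal",
    "double": "Double",      # precision 15
    "float": "Double",       # precision 7
    "int": "Integer",
    "map": "Map",            # advanced mode only
    "smallint": "Integer",
    "string": "String",
    "struct": "Struct",      # advanced mode only
    "tinyint": "Integer",
    "timestamp": "Date/Time",
    "varchar": "String",
}

def transformation_type(databricks_type: str) -> str:
    """Return the transformation data type for a Databricks native type."""
    return TYPE_MAP[databricks_type.lower()]
```

Note, for example, that both Float and Double collapse to the Double transformation type, and that several integer widths (Boolean, Int, Smallint, Tinyint) all map to Integer.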

Rules and guidelines for data types

Consider the following rules and guidelines for data types: