External tables store their data in locations outside the managed storage location associated with the metastore, catalog, or schema. An external table references an external storage path by using a LOCATION clause. For more information on external tables, see the Databricks documentation.
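As a minimal sketch of how the LOCATION clause ties a table to an external storage path, the following could be run in a Databricks notebook, where spark is the notebook's SparkSession and the catalog, schema, table, and storage path names are hypothetical:

# Sketch: define a Delta external table whose data lives at an external storage path.
# The LOCATION clause is what makes the table external.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo_catalog.demo_schema.orders_ext (
        order_id   BIGINT,
        order_date DATE,
        amount     DECIMAL(18, 2)
    )
    USING DELTA
    LOCATION 'abfss://data@mystorageaccount.dfs.core.windows.net/external/orders'
""")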
In Databricks, you can read data from external tables of Delta, Parquet, and CSV formats, but you can write data only to external tables of Delta format. External tables of Parquet and CSV formats don't apply to mappings in advanced mode.
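On the Databricks side, the read and write operations behind this behavior look roughly like the following sketch, which assumes hypothetical table names that already exist as external tables:

# Reads: Delta, Parquet, and CSV external tables can all be queried.
delta_df   = spark.table("demo_catalog.demo_schema.events_delta_ext")
parquet_df = spark.table("demo_catalog.demo_schema.events_parquet_ext")
csv_df     = spark.table("demo_catalog.demo_schema.events_csv_ext")

# Writes: only a Delta external table can be used as a target.
(delta_df.write
    .format("delta")
    .mode("append")
    .saveAsTable("demo_catalog.demo_schema.events_delta_ext"))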
You can read or write data of the following data types to external tables (see the schema sketch after the list):
• Array*
• Binary
• Bigint
• Boolean
• Date
• Decimal
• Double
• Float
• Int
• Map*
• Smallint
• String
• Struct*
• Tinyint
• Timestamp
*Applies only to mappings in advanced mode.
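To illustrate how these types map to a table schema, the following is a sketch with hypothetical column names; the Array, Map, and Struct columns apply only to mappings in advanced mode:

from pyspark.sql.types import (
    StructType, StructField, ArrayType, MapType, BinaryType, LongType,
    BooleanType, DateType, DecimalType, DoubleType, FloatType, IntegerType,
    ShortType, StringType, ByteType, TimestampType
)

# Hypothetical schema covering the supported data types.
schema = StructType([
    StructField("id", LongType()),                  # Bigint
    StructField("payload", BinaryType()),           # Binary
    StructField("is_active", BooleanType()),        # Boolean
    StructField("created_on", DateType()),          # Date
    StructField("amount", DecimalType(18, 2)),      # Decimal
    StructField("score", DoubleType()),             # Double
    StructField("ratio", FloatType()),              # Float
    StructField("quantity", IntegerType()),         # Int
    StructField("region_code", ShortType()),        # Smallint
    StructField("name", StringType()),              # String
    StructField("flag", ByteType()),                # Tinyint
    StructField("updated_at", TimestampType()),     # Timestamp
    StructField("tags", ArrayType(StringType())),   # Array*
    StructField("attributes", MapType(StringType(), StringType())),  # Map*
    StructField("address", StructType([             # Struct*
        StructField("city", StringType()),
        StructField("zip", StringType()),
    ])),
])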
When you configure a target operation to create a new external table target at runtime, specify the path to the external table in the table location. For more information, see Create a target table at runtime.
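At runtime, creating such a target resembles the following sketch, where the configured table location becomes the table's external path; the DataFrame, catalog, schema, table, and path names are hypothetical:

# Sketch: write a DataFrame as a new external Delta table whose data is stored
# at the specified table location.
(orders_df.write
    .format("delta")
    .option("path", "abfss://data@mystorageaccount.dfs.core.windows.net/external/new_orders")
    .saveAsTable("demo_catalog.demo_schema.new_orders_ext"))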