External tables store their data in a location outside of the predefined managed storage location associated with the metastore, catalog, or schema in Unity Catalog. An external table references an external storage path by using a LOCATION clause. For more information on external tables, see the Databricks documentation.
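As a sketch of the LOCATION clause described above, the following Databricks SQL statement declares an external Delta table; the catalog, schema, table name, and cloud storage path are illustrative, not taken from the original text:

```sql
-- Create an external table whose data lives at an explicit cloud storage
-- path instead of the managed storage location. All names and the path
-- below are hypothetical examples.
CREATE TABLE main.sales.orders_ext (
  order_id   BIGINT,
  order_date DATE,
  amount     DECIMAL(10, 2)
)
USING DELTA
LOCATION 'abfss://data@mystorage.dfs.core.windows.net/external/orders';
```

Because the table is external, dropping it removes only the table definition; the files at the storage path are left in place.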
You can read data from and write data to external tables in the following formats in Databricks: Avro, CSV, Delta, JSON, ORC, Parquet, and Text. External tables in Parquet and CSV format don't apply to mappings in advanced mode.
You can read or write data of the following data types to external tables:
•Array*
•Bigint
•Binary
•Boolean
•Date
•Decimal
•Double
•Float
•Int
•Map*
•Smallint
•String
•Struct*
•Timestamp
•Tinyint
*Applies only to mappings in advanced mode.
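To illustrate the starred complex types (Array, Map, and Struct), a Databricks SQL table definition might combine them as shown below; the table, column names, and storage path are hypothetical:

```sql
-- Columns using the complex types that apply only to mappings in
-- advanced mode: ARRAY, MAP, and STRUCT. Names and path are illustrative.
CREATE TABLE main.sales.events_ext (
  event_id   BIGINT,
  tags       ARRAY<STRING>,
  attributes MAP<STRING, STRING>,
  geo        STRUCT<lat: DOUBLE, lon: DOUBLE>
)
USING DELTA
LOCATION 's3://my-bucket/external/events';
```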
When you configure a target operation to create a new external table target at runtime, specify the path to the external table in the table location. For more information, see Create a target table at runtime.