SQL command | Description |
--- | --- |
ALTER TABLE | Add a column. |
CAN ATTACH TO | Allow a user to attach to a specific cluster in Databricks using the Databricks token. This permission applies only to a job cluster. |
COPY INTO | Load data into the target (ecosystem SQL ELT with CSV as the source). |
CREATE TABLE | Create a managed table. |
CREATE TABLE ... LOCATION <EXTERNAL LOCATION> | Create an external table. |
CREATE VIEW | Create a view. |
CREATE VOLUME | Create a volume in the schema. |
DELETE FROM | Delete all rows from a table. |
DROP TABLE | Delete a table. |
DROP VIEW | Delete a view. |
DROP VOLUME | Delete a volume. |
INSERT INTO | Use the Insert operation with the Write Disposition option set to Append. |
INSERT OVERWRITE INTO | Use the Insert operation with the Write Disposition option set to Truncate. |
MERGE INTO | Use the Update, Upsert, and Delete operations. |
READ VOLUME | Read files and directories stored inside a volume. |
SELECT FROM | Read from a table. |
SELECT ON FILE | Retrieve data from a file staged in Amazon S3, ADLS Gen2, or a personal staging location using the SELECT FROM command. Here are examples of the staging locations where you use the SELECT FROM command to retrieve data: <br>Read from a file in Amazon S3: <br>INSERT INTO `tgt_table` (col1, col2) SELECT col1, col2 FROM parquet.`s3a://staging.bucket/infa_tmp-20231120_062134_910212391` <br>Read from a file in ADLS Gen2: <br>INSERT INTO `tgt_table` (col1, col2) SELECT col1, col2 FROM parquet.`abfss://blob@storage_account.dfs.core.windows.net/infa_tmp-20231120_070825_17050631/` <br>Read from a file in a personal staging location: <br>INSERT INTO `tgt_table` (col1, col2) SELECT col1, col2 FROM parquet.`stage://tmp/username/infa_tmp-20231120_025852_313310201` |
WRITE VOLUME | Add, remove, or modify files and directories stored inside a volume. |
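The table above can be illustrated with a short sketch combining a few of the commands. The table name `tgt_orders`, the source table `src_orders`, the column names, and the S3 staging path are all hypothetical placeholders, not values from this document; adjust them to your own workspace before running anything.

```sql
-- Hypothetical names and paths; shown only to illustrate the commands above.

-- CREATE TABLE: create a managed target table.
CREATE TABLE IF NOT EXISTS tgt_orders (
  order_id INT,
  amount   DECIMAL(10, 2)
);

-- COPY INTO: load CSV data from a staging location into the target.
COPY INTO tgt_orders
FROM 's3a://staging.bucket/orders/'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true');

-- MERGE INTO: upsert rows from a source table into the target
-- (this is the statement behind the Update, Upsert, and Delete operations).
MERGE INTO tgt_orders AS t
USING src_orders AS s
ON t.order_id = s.order_id
WHEN MATCHED THEN
  UPDATE SET t.amount = s.amount
WHEN NOT MATCHED THEN
  INSERT (order_id, amount) VALUES (s.order_id, s.amount);
```

Note that running these statements requires the corresponding privileges from the table, such as CREATE TABLE on the schema and SELECT FROM on the source table.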