Databricks cluster

Configure the Spark parameters for the Databricks cluster to use AWS or Azure staging, depending on where the cluster is deployed.
You also need to enable the Secure Agent properties for design-time and runtime processing on the Databricks cluster.

Spark configuration

Before you connect to the Databricks cluster, you must configure the Spark parameters on AWS or Azure, depending on where the cluster is deployed.

Configuration on AWS

Add the required Spark configuration parameters for the Databricks cluster, and then restart the cluster.
Ensure that the access key and secret key that you configure have access to the buckets where you store the data for Databricks Delta tables.
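
As a minimal sketch, key-based access to S3 staging buckets is typically configured through the standard Hadoop S3A credential properties. The property names below are the standard S3A settings, not confirmed Informatica requirements, and the bracketed values are placeholders:

    # Standard Hadoop S3A credential properties (assumed)
    spark.hadoop.fs.s3a.access.key <access key>
    spark.hadoop.fs.s3a.secret.key <secret key>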

Configuration on Azure

Add the required Spark configuration parameters for the Databricks cluster, and then restart the cluster.
Ensure that the client ID and client secret that you configure have access to the file systems where you store the data for Databricks Delta tables.
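
As a minimal sketch, assuming service principal (OAuth) access to ADLS Gen2 through the standard Hadoop ABFS properties, the configuration takes the following shape; all bracketed values are placeholders:

    # Standard Hadoop ABFS OAuth properties (assumed)
    fs.azure.account.auth.type OAuth
    fs.azure.account.oauth.provider.type org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
    fs.azure.account.oauth2.client.id <client ID>
    fs.azure.account.oauth2.client.secret <client secret>
    fs.azure.account.oauth2.client.endpoint https://login.microsoftonline.com/<tenant ID>/oauth2/token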

Configure Secure Agent properties

When you configure mappings, the SQL warehouse processes the mapping by default.
To process mappings on a Databricks cluster instead, enable the Secure Agent properties.
To connect to an all-purpose cluster, enable the design-time property; to connect to a job cluster, enable the runtime property.
Note: This topic does not pertain to Mass Ingestion.

Setting the property for design time processing

Before you can import metadata and design mappings or mappings in advanced mode, perform the following steps:
  1. In Administrator, select the Secure Agent listed on the Runtime Environments tab.
  2. Click Edit.
  3. In the System Configuration Details section, select Data Integration Server as the Service and Tomcat JRE as the Type.
  4. Edit the JRE_OPTS field and set the value to -DUseDatabricksSql=false.
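If the JRE_OPTS field already contains other options, append the flag to the existing value, separated by a space. A sketch of the resulting value, where the memory option is purely illustrative:

    JRE_OPTS: -Xmx512m -DUseDatabricksSql=false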

Setting the property for runtime processing

  1. In Administrator, select the Secure Agent listed on the Runtime Environments tab.
  2. Click Edit.
  3. In the System Configuration Details section, select Data Integration Server as the Service and DTM as the Type.
  4. Edit the JVMOption field.
    a. To run mappings, set the value to -DUseDatabricksSql=false.
    b. To run mappings enabled with SQL ELT optimization, set the value to -DUseDatabricksSqlForPdo=false.
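
The DTM service type typically exposes several numbered JVMOption fields. As a sketch, assuming JVMOption1 and JVMOption2 are free, the resulting settings look like this:

    JVMOption1: -DUseDatabricksSql=false
    JVMOption2: -DUseDatabricksSqlForPdo=false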