
Prerequisites for pushdown enforcement

To enable Data Access Management to push down data access control policies and data filter policies into your cloud data platform, complete the necessary configuration and authorization tasks.
The following table lists the types of data access policies that you can push down in each type of cloud data platform:
Cloud Data Platform Type           Data Access Policy Types
Amazon Redshift                    Data access control policies and data filter policies
Databricks                         Data access control policies and data filter policies
Microsoft Fabric Data Warehouse    Data access control policies and data filter policies
Microsoft Power BI                 Data access control policies
Snowflake                          Data access control policies and data filter policies
After you configure your cloud data platform to support pushdown enforcement, assign permissions to data assets.
For more information, see Assigning permissions to source systems.

Prerequisites for Amazon Redshift pushdown enforcement

You can enable Data Access Management to push down data access control policies and data filter policies into your Amazon Redshift cloud data platform.
Complete the following configuration and authorization tasks for your Amazon Redshift cloud data platform:
  1. Configure Amazon Redshift as a catalog source.
    For more information about configuring a catalog source for Amazon Redshift, see Amazon Redshift.
  2. Grant the following privileges to the connection associated with your Amazon Redshift cloud data platform:
    grant create role to role [IDMC_USER_ROLE];
    grant { { SELECT | INSERT | UPDATE | DELETE } [,...] | ALL [ PRIVILEGES ] } on [OBJECT_NAME]
    to [IDMC_USER_ROLE] with grant option;
    To grant privileges on an object in Amazon Redshift, you must be the owner of the object, be a superuser, or hold that privilege with the grant option.
    For more information about configuring connection properties to connect to Amazon Redshift, see Connect to Amazon Redshift.
Note: If your organization uses an identity provider (IdP) and pushes data access policies to Amazon Redshift, you must add a custom property for the namespace that Amazon Redshift requires to the Data Access Management Agent service. This allows the Secure Agent to map the IDMC user groups in the data access policies into the IdP-based roles created in a namespace in Amazon Redshift.
For more information about adding properties to the Data Access Management Agent service, see Data Access Management Agent service properties.
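As an illustration only, the grants above might look as follows for a hypothetical connection user named idmc_agent_user and a sample table public.orders; the role and object names are placeholders, not product defaults:

```sql
-- Hypothetical example: idmc_agent_role and idmc_agent_user stand in for
-- [IDMC_USER_ROLE]; public.orders is a sample table.
-- Allow the connection's role to create roles in the cluster.
GRANT CREATE ROLE TO ROLE idmc_agent_role;

-- Allow the connection's user to read the table and to regrant SELECT
-- to other users when policies are pushed down.
GRANT SELECT ON public.orders TO idmc_agent_user WITH GRANT OPTION;
```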

Prerequisites for Databricks pushdown enforcement

You can enable Data Access Management to push down data access control policies and data filter policies into your Databricks cloud data platform.
Complete the following configuration and authorization tasks for your Databricks cloud data platform:
  1. Ensure that the user identified in the catalog source connection that pushes the policies has Databricks workspace admin privileges on the catalog source.
  2. Configure Databricks as a catalog source.
  3. For data filter policies, create the following database on your Databricks cloud data platform:
    CREATE DATABASE cdam_internal_state.default;
  4. For data filter policies, grant the following permissions to the connection associated with your Databricks cloud data platform:
    GRANT CREATE FUNCTION ON SCHEMA cdam_internal_state.default TO user_or_role;
    GRANT DROP FUNCTION ON SCHEMA cdam_internal_state.default TO user_or_role;
  5. For each schema on which you want to apply data filter policies, grant the following permissions:
    GRANT MANAGE ON CATALOG catalog_name TO user_or_role;
    GRANT MANAGE ON SCHEMA catalog_name.schema_name TO user_or_role;
To enforce data filter policies, Data Access Management uses the following Databricks catalog by default:
cdam_internal_state
Note: You cannot apply data filter policies to views.
For more information about configuring a catalog source for Databricks, see Register a catalog source.
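Put together, the Databricks statements above might look as follows for a hypothetical catalog named sales_catalog, a schema named sales_schema, and a service principal named idmc_agent; all three names are placeholders:

```sql
-- Hypothetical example; idmc_agent, sales_catalog, and sales_schema are
-- placeholder names. cdam_internal_state is the default internal catalog.
CREATE DATABASE IF NOT EXISTS cdam_internal_state.default;

-- Permissions for managing filter functions in the internal schema.
GRANT CREATE FUNCTION ON SCHEMA cdam_internal_state.default TO `idmc_agent`;
GRANT DROP FUNCTION ON SCHEMA cdam_internal_state.default TO `idmc_agent`;

-- Permissions on each schema that receives data filter policies.
GRANT MANAGE ON CATALOG sales_catalog TO `idmc_agent`;
GRANT MANAGE ON SCHEMA sales_catalog.sales_schema TO `idmc_agent`;
```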

Prerequisites for Microsoft Fabric Data Warehouse pushdown enforcement

You can enable Data Access Management to push down data access control policies and data filter policies into your Microsoft Fabric Data Warehouse cloud data platform.
Complete the following configuration and authorization tasks for your Microsoft Fabric Data Warehouse cloud data platform:
  1. Configure Microsoft Fabric Data Warehouse as a catalog source.
    For more information about configuring a catalog source for Microsoft Fabric Data Warehouse, see Connect to Microsoft Fabric Data Warehouse.
  2. For each database on your Microsoft Fabric Data Warehouse workspace into which you want Data Access Management to push data access control policies and data filter policies, grant the service principal the following permissions:
    Alternatively, grant the following permission to users who need to grant any permission on any database in the server:
    Note: The default Microsoft Fabric Data Warehouse schema name that Data Access Management uses to manage data filter policies is CDAM_INTERNAL_STATE. If this name does not comply with your organization's schema naming convention, use the plugin.fabric-warehouse.default.schema property to rename the schema.
  3. For data filter policies, additionally grant the following permission on the database:
    GRANT CREATE SCHEMA TO [IDMC_USER_ROLE]
    Note: If you manually created the CDAM_INTERNAL_STATE schema, add the following permission:
    GRANT ALTER ON SCHEMA::CDAM_INTERNAL_STATE TO [IDMC_USER_ROLE]
  4. For data filter policies, additionally grant the following permission on the database:
    GRANT CREATE FUNCTION TO [IDMC_USER_ROLE]
  5. For data filter policies, additionally grant the following permission to create a security policy for row-level security on the schema:
    GRANT ALTER ANY SECURITY POLICY TO [IDMC_USER_ROLE]
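Taken together, the data filter grants above might look as follows, where [idmc_agent] is a placeholder for the database user mapped to your service principal:

```sql
-- Hypothetical example; [idmc_agent] stands in for [IDMC_USER_ROLE].
-- Permissions for Data Access Management to manage data filter policies.
GRANT CREATE SCHEMA TO [idmc_agent];
GRANT CREATE FUNCTION TO [idmc_agent];
GRANT ALTER ANY SECURITY POLICY TO [idmc_agent];

-- Only if the CDAM_INTERNAL_STATE schema was created manually:
GRANT ALTER ON SCHEMA::CDAM_INTERNAL_STATE TO [idmc_agent];
```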

Prerequisites for Microsoft Power BI pushdown enforcement

You can enable Data Access Management to push down data access control policies into your Microsoft Power BI cloud data platform.
Complete the following configuration and authorization tasks for your Microsoft Power BI cloud data platform:
  1. For each Microsoft Power BI workspace into which you want Data Access Management to push data access control policies, add the service principal as a member with the Admin permission.
  2. Configure Microsoft Power BI as a catalog source.
    For more information about configuring a catalog source for Microsoft Power BI, see Microsoft Power BI Connection Properties.
  3. Grant the following permissions as the Delegated type to the connection associated with your Microsoft Power BI cloud data platform:
    Dataset.ReadWrite.All
    Dataset.Read.All
    Workspace.ReadWrite.All
  4. Grant the following permissions as the Application type to the connection associated with your Microsoft Power BI cloud data platform:
    Group.Read.All
    GroupMember.Read.All
    Note: You must grant administrator consent in Microsoft Power BI to approve application permissions.

Prerequisites for Snowflake pushdown enforcement

You can enable Data Access Management to push down data access control policies and data filter policies into your Snowflake cloud data platform.
Complete the following configuration and authorization tasks for your Snowflake cloud data platform:
  1. Configure Snowflake as a catalog source.
    For more information about configuring a catalog source for Snowflake, see Snowflake.
  2. Determine which types of data access policies you want to enforce in your Snowflake cloud data platform. You can currently enforce data access control policies and data filter policies. Each type requires different permissions.
  3. For data access control policies, grant the following permissions to the connection associated with your Snowflake cloud data platform:
    GRANT MANAGE GRANTS ON ACCOUNT TO [IDMC_USER_ROLE];
    GRANT CREATE ROLE ON ACCOUNT TO [IDMC_USER_ROLE];
  4. For data filter policies, your Snowflake account must be able to enforce Snowflake row access policies.
    To enforce row access policies, Data Access Management requires a Snowflake database to store the necessary objects. You can configure this database in any of the following ways: