To enable Data Governance and Catalog to push down data access control policies and data filter policies into your source system, complete the necessary configuration and authorization tasks.
The following table lists the types of data access policies that you can push down in each type of source system:
Source System Type | Data Access Policy Types
Amazon Redshift | Data access control policies and data filter policies
Amazon S3 | Data access control policies
Databricks | Data access control policies and data filter policies
Google BigQuery | Data access control policies
Microsoft Fabric Data Lakehouse | Data access control policies and data filter policies
Microsoft Fabric Data Warehouse | Data access control policies and data filter policies
Microsoft Power BI | Data access control policies and data filter policies
Snowflake | Data access control policies and data filter policies
Tableau | Data access control policies
After you configure your source system to support pushdown enforcement, you assign permissions to data assets.
Prerequisites for Amazon Redshift pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies and data filter policies into your Amazon Redshift source system.
Complete the following configuration and authorization tasks for your Amazon Redshift source system:
1. Configure Amazon Redshift as a catalog source.
For more information about configuring a catalog source for Amazon Redshift, see Amazon Redshift.
2. Grant the following privileges to the connection user associated with your Amazon Redshift source system:
grant create role to role [IDMC_USER_ROLE];
grant { { SELECT | INSERT | UPDATE | DELETE } [,...] | ALL [ PRIVILEGES ] } on [OBJECT_NAME] to [IDMC_USER_ROLE] with grant option;
To grant privileges on an object in Amazon Redshift, you must meet one of the following criteria:
- Be the object owner.
- Be a superuser.
- Have a grant privilege for that object and for the privilege that you'll grant.
For more information about configuring connection properties for Amazon Redshift, see Connect to Amazon Redshift.
Note: If your organization uses an identity provider (IdP) and pushes data access policies to Amazon Redshift, you must add a custom property for the namespace that Amazon Redshift requires to the Data Access Management Agent service. This allows the Secure Agent to map the IDMC user groups in the data access policies to the IdP-based roles created in a namespace in Amazon Redshift.
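The grant templates in step 2 can be expanded for a concrete role and object. The following is a minimal sketch; the role, table, and privilege names are hypothetical placeholders, not values from your environment:

```python
# Sketch: expand the grant templates from step 2 for a concrete role and
# table. The role, table, and privilege names are hypothetical placeholders.
def redshift_grants(role: str, obj: str, privileges: list[str]) -> list[str]:
    """Render the two GRANT statements that the connection user needs."""
    return [
        f"grant create role to role {role};",
        f"grant {', '.join(privileges)} on {obj} to {role} with grant option;",
    ]

# Example: allow a hypothetical agent role to manage SELECT and UPDATE
# on one table.
stmts = redshift_grants("idmc_agent_role", "sales.orders", ["SELECT", "UPDATE"])
```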
Prerequisites for Amazon S3 pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies into your Amazon S3 source system.
Data access control policies grant access to data based on one or more AWS customer-managed policies attached to an IAM role.
Complete the following configuration and authorization tasks for your Amazon S3 source system:
1. Configure Amazon S3 as a catalog source.
For more information about configuring a catalog source for Amazon S3, see Amazon S3.
2. Grant the following privileges to the connection user associated with your Amazon S3 source system:
- sts:GetCallerIdentity to resolve the AWS account ID during client initialization.
- iam:GetRole to ensure the target IAM role exists before attaching IAM policies.
- iam:CreatePolicy to allow creation of new managed IAM policies such as S3 permissions.
- iam:GetPolicy to check whether an IAM policy already exists and to retrieve its metadata.
- iam:DeletePolicy to remove managed policies when access is fully revoked.
- iam:ListPolicies to enable listing of AWS customer-managed policies, often with prefix filtering.
- iam:CreatePolicyVersion to update an existing IAM policy by creating a new version.
- iam:GetPolicyVersion to retrieve the IAM policy document content from a specific version.
- iam:ListPolicyVersions to list all versions of an IAM policy, which is useful for cleanup and version management.
- iam:DeletePolicyVersion to delete old policy versions.
Note: AWS IAM enforces a maximum of five versions per managed IAM policy.
- iam:AttachRolePolicy to attach AWS customer-managed IAM policies to IAM roles to grant permissions.
- iam:DetachRolePolicy to detach AWS customer-managed IAM policies from IAM roles to revoke permissions.
For more information about configuring connection properties for Amazon S3, see Create a connection.
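To illustrate why the version-management permissions above matter, here is a minimal sketch of the kind of customer-managed policy document involved, plus the pruning logic required by the five-version limit. The bucket, prefix, and version IDs are hypothetical, and the sketch assumes version IDs are listed oldest-first:

```python
import json

# Sketch (hypothetical bucket and prefix): a customer-managed policy
# document of the kind attached to an IAM role to grant S3 read access.
def s3_read_policy(bucket: str, prefix: str) -> str:
    doc = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/{prefix}*",
            ],
        }],
    }
    return json.dumps(doc)

# IAM allows at most five versions per managed policy. Before creating a
# new version, the oldest non-default versions must be deleted, which is
# why iam:ListPolicyVersions and iam:DeletePolicyVersion are required.
def versions_to_delete(version_ids: list[str], default_id: str,
                       limit: int = 5) -> list[str]:
    """Return the oldest non-default versions to delete so one new version fits.

    Assumes version_ids is ordered oldest-first.
    """
    candidates = [v for v in version_ids if v != default_id]
    excess = len(version_ids) - (limit - 1)  # room needed for the new version
    return candidates[:max(excess, 0)]
```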
Prerequisites for Databricks pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies and data filter policies into your Databricks source system.
Complete the following configuration and authorization tasks for your Databricks source system:
1. Ensure that the user identified in the catalog source connection that pushes the policies has Databricks workspace admin permissions on the catalog source.
2. Configure Databricks as a catalog source.
To enforce data filter policies, Data Governance and Catalog uses the following Databricks catalog by default:
cdam_internal_state
3. For data filter policies, create the following catalog on your Databricks source system:
CREATE CATALOG cdam_internal_state;
Note: You cannot apply data filter policies to views.
4. For data filter policies, grant the following permissions to the connection user associated with your Databricks source system:
GRANT CREATE FUNCTION ON SCHEMA cdam_internal_state.default TO user_or_role;
GRANT DROP FUNCTION ON SCHEMA cdam_internal_state.default TO user_or_role;
5. For each schema on which you want to apply data filter policies, grant the following permissions:
GRANT MANAGE ON CATALOG catalog_name TO user_or_role;
GRANT MANAGE ON SCHEMA catalog_name.schema TO user_or_role;
For more information about configuring a catalog source for Databricks, see Register a catalog source.
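When many schemas need data filter policies, the step 5 grants repeat per catalog and schema. The following sketch generates that batch; the principal and schema names are hypothetical placeholders:

```python
# Sketch: emit the step 5 grants for every schema that should accept
# data filter policies. Principal and schema names are hypothetical.
def filter_policy_grants(principal: str,
                         qualified_schemas: list[str]) -> list[str]:
    """Return GRANT MANAGE statements for each catalog and schema."""
    stmts = []
    for qualified in qualified_schemas:  # e.g. "main.sales"
        catalog, _, _schema = qualified.partition(".")
        stmts.append(f"GRANT MANAGE ON CATALOG {catalog} TO `{principal}`;")
        stmts.append(f"GRANT MANAGE ON SCHEMA {qualified} TO `{principal}`;")
    return stmts
```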
Prerequisites for Google BigQuery pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies into your Google BigQuery source system.
Complete the following configuration and authorization tasks for your Google BigQuery source system:
1. Configure Google BigQuery as a catalog source.
For more information about configuring a catalog source for Google BigQuery, see Google BigQuery in the Metadata Command Center help.
2. In Google BigQuery, create a custom role with the following permissions per project:
- roles/bigquery.user to run statements on Google BigQuery
- bigquery.tables.setIamPolicy to grant or revoke roles for assets
- iam.roles.get to view role state information
- iam.roles.undelete to undelete deleted custom roles
- iam.roles.update to update custom roles, for example to switch them to a disabled state
- iam.roles.create to create custom roles
- iam.roles.list to view role information
- Optional: If you don't want to give iam.roles.create and iam.roles.list permissions to Data Governance and Catalog, you can instead create the following custom roles in each project:
▪ CDAM read role
  - roleId: CDAMRead
  - permission: bigquery.tables.getData
▪ CDAM write role
  - roleId: CDAMWrite
  - permission: bigquery.tables.updateData
3. Alternatively, grant the following roles to the Google BigQuery service account used in the connection:
- roles/bigquery.dataOwner to grant or revoke roles
- roles/iam.roleAdmin to create custom roles
For more information about configuring connection properties for Google BigQuery, see Connect to Google BigQuery.
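If you create the optional CDAMRead and CDAMWrite roles yourself, the definitions from step 2 can be expressed as role payloads of the shape the IAM API accepts. This is a sketch of those two definitions only; the "stage" value is an assumption:

```python
# Sketch: the two optional custom roles from step 2, expressed as role
# definition payloads. roleIds and permissions come from the text above;
# the "stage" value is an assumption.
def cdam_custom_roles() -> dict[str, dict]:
    return {
        "CDAMRead": {
            "title": "CDAM read role",
            "includedPermissions": ["bigquery.tables.getData"],
            "stage": "GA",
        },
        "CDAMWrite": {
            "title": "CDAM write role",
            "includedPermissions": ["bigquery.tables.updateData"],
            "stage": "GA",
        },
    }
```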
Prerequisites for Microsoft Fabric Data Lakehouse pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies and data filter policies into your Microsoft Fabric Data Lakehouse source system.
Complete the following configuration and authorization tasks for your Microsoft Fabric Data Lakehouse source system:
1. Configure Microsoft Fabric Data Lakehouse as a catalog source.
For more information about configuring a catalog source for Microsoft Fabric Data Lakehouse, see Connect to Microsoft Fabric Data Lakehouse.
2. For each database into which you will push data access control policies and data filter policies, grant the service principal the following permissions:
- CREATE ROLE
GRANT CREATE ROLE ON DATABASE::[DATABASE_NAME] TO [IDMC_USER_ROLE];
GO
- ALTER ANY ROLE
GRANT ALTER ANY ROLE ON DATABASE::[DATABASE_NAME] TO [IDMC_USER_ROLE];
GO
- CONTROL
GRANT CONTROL ON DATABASE::[DATABASE_NAME] TO [IDMC_USER_ROLE];
GO
Alternatively, grant the following permission to users who need to grant any permission on any database in the server:
- CONTROL SERVER
GRANT CONTROL SERVER TO [IDMC_USER_ROLE];
GO
3. For data filter policies, additionally grant the following permissions on the database:
GRANT CREATE SCHEMA TO [IDMC_USER_ROLE];
GRANT CREATE FUNCTION TO [IDMC_USER_ROLE];
Note: The default Microsoft Fabric Data Warehouse schema name that Data Access Management uses to manage data filter policies is CDAM_INTERNAL_STATE. If this name does not comply with your organization's schema naming convention, use the plugin.fabric-warehouse.default.schema property to give the schema another name.
Note: If you manually created the CDAM_INTERNAL_STATE schema, add the following permission for data filter policies:
GRANT ALTER ON SCHEMA::CDAM_INTERNAL_STATE TO [IDMC_USER_ROLE];
4. For data filter policies, additionally grant permission to create a security policy for row-level security on the schema:
GRANT ALTER ANY SECURITY POLICY TO [IDMC_USER_ROLE];
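The grants in steps 2 through 4 can be assembled into one batch per database. The following sketch renders that batch; the database and role names are hypothetical placeholders, and the last three grants are needed only when you use data filter policies:

```python
# Sketch: assemble the grant batch from steps 2-4 for one database.
# Database and role names are hypothetical placeholders.
def fabric_grant_batch(database: str, role: str) -> str:
    lines = []
    for perm in ("CREATE ROLE", "ALTER ANY ROLE", "CONTROL"):
        lines.append(f"GRANT {perm} ON DATABASE::[{database}] TO [{role}];")
        lines.append("GO")
    # Additional grants required only for data filter policies:
    lines += [
        f"GRANT CREATE SCHEMA TO [{role}];",
        f"GRANT CREATE FUNCTION TO [{role}];",
        f"GRANT ALTER ANY SECURITY POLICY TO [{role}];",
    ]
    return "\n".join(lines)
```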
Prerequisites for Microsoft Fabric Data Warehouse pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies and data filter policies into your Microsoft Fabric Data Warehouse source system.
Complete the following configuration and authorization tasks for your Microsoft Fabric Data Warehouse source system:
1. Configure Microsoft Fabric Data Warehouse as a catalog source.
2. For each database into which you will push data access control policies and data filter policies, grant the service principal the following permissions:
- CREATE ROLE
GRANT CREATE ROLE ON DATABASE::[DATABASE_NAME] TO [IDMC_USER_ROLE];
GO
- ALTER ANY ROLE
GRANT ALTER ANY ROLE ON DATABASE::[DATABASE_NAME] TO [IDMC_USER_ROLE];
GO
- CONTROL
GRANT CONTROL ON DATABASE::[DATABASE_NAME] TO [IDMC_USER_ROLE];
GO
Alternatively, grant the following permission to users who need to grant any permission on any database in the server:
- CONTROL SERVER
GRANT CONTROL SERVER TO [IDMC_USER_ROLE];
GO
3. For data filter policies, additionally grant the following permissions on the database:
GRANT CREATE SCHEMA TO [IDMC_USER_ROLE];
GRANT CREATE FUNCTION TO [IDMC_USER_ROLE];
Note: The default Microsoft Fabric Data Warehouse schema name that Data Access Management uses to manage data filter policies is CDAM_INTERNAL_STATE. If this name does not comply with your organization's schema naming convention, use the plugin.fabric-warehouse.default.schema property to give the schema another name.
Note: If you manually created the CDAM_INTERNAL_STATE schema, add the following permission for data filter policies:
GRANT ALTER ON SCHEMA::CDAM_INTERNAL_STATE TO [IDMC_USER_ROLE];
4. For data filter policies, additionally grant permission to create a security policy for row-level security on the schema:
GRANT ALTER ANY SECURITY POLICY TO [IDMC_USER_ROLE];
Prerequisites for Microsoft Power BI pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies and data filter policies into your Microsoft Power BI source system.
Note: You must use one of the following types of Microsoft workspaces:
•Fabric
•Power BI Premium
•Power BI Premium Per User
In Metadata Command Center, configure Microsoft Power BI as a catalog source.
In Microsoft Power BI, complete the following configuration and authorization tasks:
1. Disable the tenant-level setting "Block republish and disable package refresh" to prevent any access issues related to Microsoft Power BI permissioning.
2. For each Microsoft Power BI workspace into which you want Data Governance and Catalog to push data access control policies and data filter policies, add the user or service principal as at least a member or admin of the workspace.
3. Ensure that the XMLA endpoint property is set to read-write.
4. Grant the following permissions as the Delegated type to the connection user associated with your Microsoft Power BI source system:
5. Grant the following permissions as the Application type to the connection user associated with your Microsoft Power BI source system:
Group.Read.All
GroupMember.Read.All
Note:
You must grant administrator consent in Microsoft Power BI to approve application permissions.
6. For use with data filter policies, grant the following minimum permissions:
- To allow the Data Access Management Secure Agent service to create data filter policies in tables:
Create UDF on Database.Schema cdam_control.default
- To allow the Data Access Management Secure Agent service to apply a data filter policy:
Create filter policy to <table>
Alternatively, you can grant the Data Access Management Secure Agent service the following permission:
Create Database cdam_control
Prerequisites for Snowflake pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies and data filter policies into your Snowflake source system.
Complete the following configuration and authorization tasks for your Snowflake source system:
1. Configure Snowflake as a catalog source.
For more information about configuring a catalog source for Snowflake, see Snowflake.
2. Determine which types of data access policies you want to enforce in your Snowflake source system. You can currently enforce data access control policies and data filter policies. Each requires different permissions.
3. For use with data access control policies, grant the following permissions to the user role associated with your Snowflake source system connection:
GRANT MANAGE GRANTS ON ACCOUNT TO [IDMC_USER_ROLE];
GRANT CREATE ROLE ON ACCOUNT TO [IDMC_USER_ROLE];
4. For use with data filter policies, your Snowflake account must be able to enforce Snowflake row access policies.
To enforce row access policies, Data Access Management requires a Snowflake database to store the necessary objects. You can configure this database in any of the following ways:
- The Data Access Management Secure Agent service can create it automatically, if you grant it the following permission:
GRANT CREATE DATABASE ON ACCOUNT TO [IDMC_USER_ROLE];
- You can create the database yourself by creating a database named CDAM_INTERNAL_STATE and granting at a minimum the following permissions to the Snowflake role that you associate with the connection that you create in IDMC:
GRANT USAGE ON DATABASE CDAM_INTERNAL_STATE TO ROLE [IDMC_USER_ROLE];
GRANT CREATE ROW ACCESS POLICY ON SCHEMA "CDAM_INTERNAL_STATE"."PUBLIC" TO ROLE [IDMC_USER_ROLE];
For all databases for which you want to apply row access policies, grant the following permission:
GRANT USAGE ON DATABASE [DATABASE_NAME] TO ROLE [IDMC_USER_ROLE];
Regardless of who created the database, grant the following permission to the user in the connection:
GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO [IDMC_USER_ROLE];
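For the case where you create CDAM_INTERNAL_STATE yourself, the full grant script from step 4 can be sketched as follows; the role and database names are hypothetical placeholders:

```python
# Sketch: the complete Snowflake grant script for data filter policies
# when you create the CDAM_INTERNAL_STATE database yourself.
# Role and database names are hypothetical placeholders.
def snowflake_filter_policy_setup(role: str,
                                  target_dbs: list[str]) -> list[str]:
    stmts = [
        f"GRANT USAGE ON DATABASE CDAM_INTERNAL_STATE TO ROLE {role};",
        f'GRANT CREATE ROW ACCESS POLICY ON SCHEMA '
        f'"CDAM_INTERNAL_STATE"."PUBLIC" TO ROLE {role};',
    ]
    # One USAGE grant per database on which row access policies will apply.
    for db in target_dbs:
        stmts.append(f"GRANT USAGE ON DATABASE {db} TO ROLE {role};")
    # Required regardless of who created the database.
    stmts.append(f"GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE {role};")
    return stmts
```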
Prerequisites for Tableau pushdown enforcement
You can enable Data Governance and Catalog to push down data access control policies into your Tableau source system.
Complete the following configuration and authorization tasks for your Tableau source system:
1. Configure Tableau as a catalog source.
For more information about configuring a catalog source for Tableau, see Tableau in the Metadata Command Center help.
2. Configure the user account with the following permissions:
- Interactor license level
- View and download permissions for all projects, workbooks, and catalog sources that you want Data Governance and Catalog to act upon
Note: Tableau automatically grants permissions from parent objects to their child objects, such as tables within a database.
For more information about configuring connection properties for Tableau, see Create a Connection.