Data Access Management Agent service properties

To change or optimize the behavior of the Data Access Management Agent service, configure its properties in the System Configuration Details section when you edit a Secure Agent.
When you view or edit a Secure Agent and select the Data Access Management Agent service, its properties appear in the System Configuration Details area.
You can configure the following system properties of the Data Access Management Agent service in the System Configuration Details section:
All of the following properties are of type AGENT.

enableFileBasedAudit
Description: Set to true to generate audit logs. Leave as false to prevent audit log generation.
Sample value: true
Default value: false

customFileBasedAuditPath
Description: The directory path to which audit logs are written. The default path is the directory in which you installed the Secure Agent.
Sample value: log/audit/
Default value: not set

usernameWithConnectionPrivileges
Description: A user name from the tenant that has the read Connection privilege. The agent uses this user name to retrieve the connection configuration and credentials from the runtime service.
Note: The Data Access Management Agent and Proxy services will not start without valid values for the usernameWithConnectionPrivileges and userWithConnectionPrivileges properties.
Sample value: jdoe
Default value: not set

pullChangesBatchSize
Description: The maximum number of updates to process in a batch.
Sample value: 50
Default value: 100

pollingPeriod
Description: How often the agent polls for new updates.
Sample value: 1h
Default value: 5m

pingPeriod
Description: How often the agent pings the runtime service to indicate that it is still up and consuming updates.
Sample value: 45s
Default value: 1m

plugin.databricks.default.useIsAccountGroupMember
plugin.databricks.<connection-id>.useIsAccountGroupMember
Description: Grants data access based on account group membership. plugin.databricks.default.useIsAccountGroupMember applies to all connections of type Databricks. plugin.databricks.<connection-id>.useIsAccountGroupMember applies only to the specified connection and overrides the default.
Note: You must set this property to true if plugin.databricks.default.useIsMember is set to false. Both properties can be true.
Sample value: true
Default value: false

plugin.databricks.default.useIsMember
plugin.databricks.<connection-id>.useIsMember
Description: Grants data access based on local workspace group membership. plugin.databricks.default.useIsMember applies to all connections of type Databricks. plugin.databricks.<connection-id>.useIsMember applies only to the specified connection and overrides the default.
Note: You must set this property to true if plugin.databricks.default.useIsAccountGroupMember is set to false. Both properties can be true.
Sample value: false
Default value: true

plugin.redshift.default.namespace
plugin.redshift.<connection-id>.namespace
Description: If your organization uses an identity provider (IdP), this property allows the Secure Agent to map the IDMC user groups in the data access policies to the IdP-based roles created in a namespace in Amazon Redshift. plugin.redshift.default.namespace applies to all connections of type Amazon Redshift. plugin.redshift.<connection-id>.namespace applies only to the specified connection and overrides the default.
Sample value: my_namespace
Default value: not set

datasourceChangesParallelism
Description: The maximum number of data source updates to process in parallel.
Sample value: 3
Default value: 4

maxShutdownTime
Description: How long to wait for a graceful shutdown to complete before invoking a forced shutdown.
Sample value: 45s
Default value: 1m
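The duration-valued properties (pollingPeriod, pingPeriod, maxShutdownTime) take values such as 45s, 5m, and 1h, and the two Databricks membership flags must not both be false. The documentation does not spell out the accepted duration format, so the following sketch is illustrative only: it assumes a simple integer-plus-unit pattern inferred from the sample values, and the helper names are invented for this example, not part of the product.

```python
import re

# Assumed duration format: an integer followed by s (seconds), m (minutes),
# or h (hours), matching the sample values in the table (45s, 5m, 1h).
_UNITS = {"s": 1, "m": 60, "h": 3600}

def parse_duration(value: str) -> int:
    """Return the duration in seconds, e.g. '5m' -> 300."""
    match = re.fullmatch(r"(\d+)([smh])", value)
    if not match:
        raise ValueError(f"unrecognized duration: {value!r}")
    number, unit = match.groups()
    return int(number) * _UNITS[unit]

def check_databricks_flags(use_account_group_member: bool,
                           use_is_member: bool) -> None:
    """At least one membership check must be enabled; both may be true."""
    if not (use_account_group_member or use_is_member):
        raise ValueError(
            "set plugin.databricks.default.useIsAccountGroupMember or "
            "plugin.databricks.default.useIsMember to true"
        )

print(parse_duration("5m"))   # 300
print(parse_duration("1h"))   # 3600
check_databricks_flags(use_account_group_member=True, use_is_member=False)
```

A pre-check like this can catch a malformed duration or a both-false Databricks configuration before the agent service fails to apply the properties.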
Note: If your organization uses an identity provider (IdP) and pushes data access policies to Amazon Redshift, you must add a custom property to the Data Access Management Agent service for the namespace that Amazon Redshift requires. This allows the Secure Agent to map the IDMC user groups in the data access policies to the IdP-based roles created in a namespace in Amazon Redshift.
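For example, the custom property might look like the following entry, where my_namespace is a placeholder for the namespace that your Amazon Redshift IdP integration uses:

```
Type:  AGENT
Name:  plugin.redshift.default.namespace
Value: my_namespace
```

To scope the namespace to a single connection instead, use the plugin.redshift.<connection-id>.namespace form described in the properties list above.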
Note: The default Microsoft Fabric Data Warehouse schema name that Data Access Management uses to manage data filter policies is CDAM_INTERNAL_STATE. If this name does not comply with your organization's schema naming convention, use the plugin.fabric-warehouse.default.schema property to give the schema another name.
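To rename the schema, you might add a custom property like the following, where DAM_STATE is a hypothetical schema name chosen to match an organization's convention:

```
Type:  AGENT
Name:  plugin.fabric-warehouse.default.schema
Value: DAM_STATE
```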