Data Access Management includes the following new features and enhancements in this release.
Push down data access policies to Databricks
You can use a data access policy to grant read, write, or delete access to a table or view in a Databricks data source. Data Access Management pushes the policy down to the Databricks data source, where it is enforced.
For more information about data access policies, see Introduction and Getting Started in the Data Access Management help.
Replace input values with hashed values
You can use the hashing data protection technique to replace raw values with hashed values.
For more information about data protection techniques, see Introduction and Getting Started in the Data Access Management help.
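The hashing technique replaces each raw value with a one-way hash, so the original value cannot be recovered from the output. The release note does not specify the algorithm Data Access Management uses; the following is a minimal sketch of the idea using SHA-256, with a hypothetical helper name:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HexFormat;

public class HashingSketch {
    // Replace a raw value with its SHA-256 hash, rendered as hex.
    // Illustrative only; the product's actual algorithm is not documented here.
    static String hashValue(String raw) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(raw.getBytes(StandardCharsets.UTF_8));
        return HexFormat.of().formatHex(digest);
    }

    public static void main(String[] args) throws Exception {
        // The same input always hashes to the same output,
        // so joins on hashed columns still match.
        System.out.println(hashValue("alice@example.com"));
        System.out.println(hashValue("alice@example.com"));
    }
}
```

Because hashing is deterministic, equal raw values produce equal hashed values, which preserves join and group-by behavior on the protected column.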
Replace input values with human-readable substitutes
You can use the substitution data protection technique to consistently replace raw values with your own human-readable substitutes, rather than random, tokenized values.
For more information about data protection techniques, see Introduction and Getting Started in the Data Access Management help.
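The key property of the substitution technique is consistency: each raw value always maps to the same human-readable substitute. The sketch below illustrates that idea with a hypothetical substitute pool and helper name; in the product, you supply your own substitutes:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SubstitutionSketch {
    // Hypothetical pool of human-readable substitutes;
    // the real technique uses substitutes that you define.
    static final List<String> SUBSTITUTES = List.of("Avery", "Blake", "Casey", "Drew");
    static final Map<String, String> assigned = new HashMap<>();

    // Consistently map each raw value to the same substitute,
    // rather than to a random tokenized value.
    static String substitute(String raw) {
        return assigned.computeIfAbsent(raw,
                r -> SUBSTITUTES.get(Math.floorMod(r.hashCode(), SUBSTITUTES.size())));
    }

    public static void main(String[] args) {
        System.out.println(substitute("Margaret"));
        System.out.println(substitute("Margaret")); // same substitute every time
    }
}
```

Consistent substitution keeps protected data readable and referentially intact across rows, at the cost of revealing when two rows share the same underlying value.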
Specify the number of allowed error rows before a query aborts
When tokenization errors occur, you can now specify how many error rows Data Access Management redacts with NULL before the query aborts. To do so, set the maximumNumberOfTransformationFailures connection property on the data proxy JDBC driver. The default is 100 rows.
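Since this is a JDBC connection property, you can set it in the Properties object you pass when opening the connection. Only the property name comes from the release note; the URL, credentials, and limit value below are placeholders:

```java
import java.util.Properties;

public class ProxyConnectionSketch {
    // Build connection properties for the data proxy JDBC driver.
    // User and limit value are hypothetical; only the property name
    // maximumNumberOfTransformationFailures comes from the release note.
    static Properties proxyProperties() {
        Properties props = new Properties();
        props.setProperty("user", "analyst"); // placeholder credential
        // Abort the query after 10 error rows instead of the default 100.
        props.setProperty("maximumNumberOfTransformationFailures", "10");
        return props;
    }

    public static void main(String[] args) {
        // With the data proxy JDBC driver on the classpath, you would connect with:
        // Connection conn = DriverManager.getConnection(proxyJdbcUrl, proxyProperties());
        System.out.println(proxyProperties());
    }
}
```

A lower value fails fast on systematically bad data; a higher value tolerates scattered bad rows at the cost of more NULL-redacted output.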