Before you create a catalog source, ensure that you have the information required to connect to the source system.
Complete the following prerequisite tasks:
• Assign the required permissions.
• To scan procedures and functions, set one of the following roles, depending on the permissions of the user account (see the example after this list):
- With administrator permission. In the JDBC connection string, use role=<procedure owner> or role=SYSADMIN.
Only the SYSADMIN role can scan procedures or functions defined with the EXECUTE AS OWNER clause.
- Without administrator permission. Grant the USAGE privilege on all procedures and functions, and create the procedures with the EXECUTE AS CALLER clause.
• Configure a connection to the Snowflake source system in Administrator.
• Optionally, if you want to identify pairs of similar columns and relationships between tables within a catalog source, import a relationship inference model.
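For example, with administrator permission you might add role=PROC_OWNER to the JDBC connection string, where PROC_OWNER is a hypothetical name for the procedure owner's role. Without administrator permission, the following Snowflake SQL sketch defines a procedure with the EXECUTE AS CALLER clause and grants USAGE on it. The database MYDB, the ORDERS table, the procedure GET_ORDER_COUNT, and the role CATALOG_ROLE are placeholder names; substitute the names from your environment.
-- Define the procedure so that it runs with the caller's rights
CREATE OR REPLACE PROCEDURE MYDB.PUBLIC.GET_ORDER_COUNT()
RETURNS NUMBER
LANGUAGE SQL
EXECUTE AS CALLER
AS
$$
DECLARE
  cnt NUMBER;
BEGIN
  SELECT COUNT(*) INTO :cnt FROM MYDB.PUBLIC.ORDERS;
  RETURN cnt;
END;
$$;
-- Allow the role used by the catalog source to scan the procedure
GRANT USAGE ON PROCEDURE MYDB.PUBLIC.GET_ORDER_COUNT() TO ROLE CATALOG_ROLE;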
Verify permissions
To extract metadata and to configure other capabilities that a catalog source might include, you need account access and permissions on the source system. The permissions required might vary depending on the capability.
Permissions to extract metadata
To extract Snowflake metadata, you need access to the Snowflake source system.
Grant read permission to the user account on all tables in the database from which you extract metadata.
Grant permissions that allow you to perform the following operations:
• select on information_schema.EXTERNAL_TABLES
• select on information_schema.FUNCTIONS
• select on information_schema.PIPES
• select on information_schema.PROCEDURES
• select on information_schema.SCHEMATA
• select on information_schema.SEQUENCES
• select on information_schema.STAGES
• select on SNOWFLAKE.ACCOUNT_USAGE.TAGS
• select on SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES
• show objects
• show columns
• show primary keys
• show imported keys
• show streams
• show materialized views
• show views
• show tasks
• show databases
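For example, you might grant the read access with statements similar to the following. This is a minimal sketch that assumes a database named MYDB and a role named CATALOG_ROLE used by the catalog source; substitute your own names:
-- Read access to the database objects
GRANT USAGE ON DATABASE MYDB TO ROLE CATALOG_ROLE;
GRANT USAGE ON ALL SCHEMAS IN DATABASE MYDB TO ROLE CATALOG_ROLE;
GRANT SELECT ON ALL TABLES IN DATABASE MYDB TO ROLE CATALOG_ROLE;
GRANT SELECT ON ALL VIEWS IN DATABASE MYDB TO ROLE CATALOG_ROLE;
-- Access to SNOWFLAKE.ACCOUNT_USAGE.TAGS and TAG_REFERENCES is typically provided through imported privileges
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE CATALOG_ROLE;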
Permissions to run data profiles
Ensure that you have the required permissions to run profiles.
Grant SELECT permissions for tables and views that you want to profile.
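For example, assuming the tables and views that you want to profile are in the MYDB.SALES schema and the connection uses the CATALOG_ROLE role (both hypothetical names):
GRANT SELECT ON ALL TABLES IN SCHEMA MYDB.SALES TO ROLE CATALOG_ROLE;
GRANT SELECT ON ALL VIEWS IN SCHEMA MYDB.SALES TO ROLE CATALOG_ROLE;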
Permissions to perform data classification
You don't need any additional permissions to run data classification.
Permissions to perform relationship discovery
You don't need any additional permissions to run relationship discovery.
Permissions to perform glossary association
You don't need any additional permissions to run glossary association.
Permissions to perform writeback
Ensure that you have the required permissions to perform writeback.
You need the following privileges to perform writeback:
• CREATE TAG privilege to create tags in a database or schema.
• APPLY TAG privilege to assign or modify tag values for objects.
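For example, the following sketch assumes that tags are created in the MYDB.PUBLIC schema and that the connection uses the CATALOG_ROLE role; both names are placeholders:
-- Allow the role to create tags in the schema
GRANT CREATE TAG ON SCHEMA MYDB.PUBLIC TO ROLE CATALOG_ROLE;
-- Allow the role to assign or modify tag values on objects across the account
GRANT APPLY TAG ON ACCOUNT TO ROLE CATALOG_ROLE;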
Create a connection
When you configure a connection, you specify the connection properties that enable an agent to connect to the data source.
1. In Administrator, select Connections.
2. Click New Connection.
3. Enter the following connection details:
- Connection Name. Name of the connection. Each connection name must be unique within the organization. Connection names can contain alphanumeric characters, spaces, and the following special characters: _ . + -,
Maximum length is 255 characters.
- Description. Description of the connection. Maximum length is 4000 characters.
- Type. Snowflake Data Cloud.
4. In the Snowflake Data Cloud Properties section, select the runtime environment where you want to run the tasks. The runtime environment is either a Secure Agent or a serverless runtime environment.
5. In the Connection section, select the authentication method.
You can use the following authentication methods to connect to Snowflake:
- Standard. Uses the Snowflake account user name and password credentials to connect to Snowflake.
- KeyPair. Uses the private key file and private key file password, along with the existing Snowflake account user name to connect to Snowflake.
- Client Credentials. Uses, at a minimum, the client ID, access token URL, client secret, scope, and access token to connect to Snowflake.
Standard authentication
This authentication method requires the Snowflake account user name and password credentials to connect to Snowflake.
The following list describes the basic connection properties for standard authentication:
- Username. The user name to connect to the Snowflake account.
- Password. The password to connect to the Snowflake account.
- Account. The name of the Snowflake account.
For example, if the Snowflake URL is https://<123abc>.us-east-2.aws.snowflakecomputing.com/console/login#/, your account name is the part of the URL before snowflakecomputing.com. Here, 123abc.us-east-2.aws is your account name.
If you use the Snowsight URL, for example, https://app.snowflake.com/us-east-2.aws/<123abc>/dashboard, your account name is 123abc.us-east-2.aws.
Note: Ensure that the account name doesn't contain underscores. If the account name contains underscores, you need to use an alias name. To get an alias name, contact Snowflake Customer Support.
- Warehouse. The Snowflake warehouse name.
- Role. The Snowflake role assigned to the user.
- Additional JDBC URL Parameters. Additional JDBC connection parameters.
You can specify multiple JDBC connection parameters, separated by an ampersand (&), in the following format:
<parameter1>=<value1>&<parameter2>=<value2>
For example, you can pass the following database and schema values when you connect to Snowflake:
db=mydb&schema=public
When you add parameters, ensure that there is no space before or after the equal sign (=).
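For example, to pass the procedure owner's role described in the prerequisites along with the database and schema, you might set the additional parameters as follows, where PROC_OWNER is a hypothetical role name:
db=mydb&schema=public&role=PROC_OWNER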
Key pair authentication
This authentication method requires the private key file and the private key file password, along with the Snowflake account user name, to connect to Snowflake.
The following list describes the basic connection properties for key pair authentication:
- Username. The user name to connect to the Snowflake account.
- Account. The name of the Snowflake account.
For example, if the Snowflake URL is https://<123abc>.us-east-2.aws.snowflakecomputing.com/console/login#/, your account name is the part of the URL before snowflakecomputing.com. Here, 123abc.us-east-2.aws is your account name.
If you use the Snowsight URL, for example, https://app.snowflake.com/us-east-2.aws/<123abc>/dashboard, your account name is 123abc.us-east-2.aws.
Note: Ensure that the account name doesn't contain underscores. If the account name contains underscores, you need to use an alias name. To get an alias name, contact Snowflake Customer Support.
- Warehouse. The Snowflake warehouse name.
- Private Key File. Path to the private key file, including the private key file name, that the Secure Agent uses to access Snowflake.
For example, specify the following path and key file name on the Secure Agent machine:
- On Windows: C:\Users\path_to_key_file\rsa_key.p8
- On Linux: /export/home/user/path_to_key_file/rsa_key.p8
To use the serverless runtime environment, specify the path and key file name in the serverless agent directory.
- Private Key File Password. Password for the private key file.
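Key pair authentication works only if the public key that corresponds to the private key file is assigned to the Snowflake user. The following is a minimal sketch, assuming a user named CATALOG_USER and an already generated public key; the key value shown is a placeholder:
ALTER USER CATALOG_USER SET RSA_PUBLIC_KEY='<public_key_value>';
-- Verify that the key is assigned
DESC USER CATALOG_USER;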
Import a relationship inference model
Import a relationship inference model if you want to configure the relationship discovery capability. You can either import a predefined relationship inference model, or import a model file from your local machine.
1. In Metadata Command Center, click Explore on the navigation panel.
2. Expand the menu and select Relationship Inference Model.
3. Select one of the following options:
- Import Predefined Content. Imports a predefined relationship inference model called Column Similarity Model v1.0.
- Import. Imports the predefined relationship inference model from your local machine. Select this if you previously imported predefined content into your local machine and the inference model is stored on the machine.
To import a file, click Choose File in the Import Relationship Inference Model window and navigate to the model file on your local machine. You can also drag and drop the file.
The imported models appear in the list of relationship inference models on the Relationship Discovery tab.