To use the Business Insights AI Agent using CDGC Metadata Discovery recipe, perform the following prerequisite tasks:
1. Import Python external libraries.
2. Create a catalog source with metadata and business glossaries.
3. Configure environment variables.
Note: Before you run the Business Insights AI Agent using CDGC Metadata Discovery agent, you must have a Secure Agent installed. For more information, see Secure Agent installation in the Administrator help.
Step 1. Import Python external libraries
Before you run the Business Insights AI Agent using CDGC Metadata Discovery agent, import the following supported Python external libraries into the directory where your Secure Agent is installed:
• json
• requests
• PyJWT (for Snowflake)
• cryptography (for Snowflake)
Note: If you want to use databases other than the natively supported ones (Snowflake and Databricks), create a Python tool, which might require additional libraries depending on the database. For information about the recommended Python client libraries, see the official documentation of the respective database.
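Before running the agent, you can verify that the listed libraries are importable from the Secure Agent's Python environment. The pre-flight check below is a hypothetical sketch, not part of the recipe; note that PyJWT is imported as jwt and the Cryptography package as cryptography:

```python
import importlib.util

# Hypothetical pre-flight check for the libraries listed above.
# PyJWT is imported as "jwt"; Cryptography is imported as "cryptography".
required = ["json", "requests", "jwt", "cryptography"]
missing = [name for name in required if importlib.util.find_spec(name) is None]
if missing:
    print(f"Install these into the Secure Agent's Python environment: {missing}")
```

If the check prints any names, install those packages into the Python environment used by the Secure Agent before running the agent.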
Step 2. Create a catalog source with metadata and business glossaries
Before using the agent, scan database tables with Metadata Command Center to create a catalog source. Annotate tables and columns with descriptions, and link business glossaries to columns to add valuable business context.
While full documentation is not mandatory, adding more metadata and associations improves the agent’s ability to select relevant information. You can link business glossaries manually or automatically using the glossary association capability during scanning.
Prepare your datasets carefully because technical column names are often unclear without additional context.
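To see why glossary links matter, consider how cryptic column names resolve once a business term is attached. The column names and glossary terms below are hypothetical, purely to illustrate the kind of context the agent gains from annotated metadata:

```python
# Hypothetical mapping of technical column names to linked business glossary
# terms. Without the glossary link, the raw name carries little meaning.
column_glossary = {
    "cust_seg_cd": "Customer Segment Code",
    "txn_amt_lcy": "Transaction Amount (Local Currency)",
    "acct_open_dt": "Account Open Date",
}

def describe(column: str) -> str:
    # Fall back to the raw technical name when no glossary term is linked.
    return column_glossary.get(column, column)
```

A column like txn_amt_lcy is ambiguous on its own; with the glossary association, the agent can match it to questions about transaction amounts.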
Step 3. Configure environment variables
To ensure data security, retrieve sensitive information such as passwords, API keys, and other credentials used in the Python tools exclusively from environment variables.
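In a Python tool, this means reading credentials with the standard library rather than hard-coding them. A minimal sketch, assuming a hypothetical variable name of your choosing:

```python
import os

def get_required_env(name: str) -> str:
    """Read a credential from an environment variable; fail fast if missing."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value

# Example usage with a hypothetical variable name:
# password = get_required_env("SNOWFLAKE_PASSWORD")
```

Failing fast when a variable is unset makes misconfiguration visible at startup instead of surfacing as an authentication error mid-run.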
To configure these variables, perform the following steps:
1. Create a file named custom_env_settings.sh and save it in the following directory:
2. Copy the following Bash script into custom_env_settings.sh and enter the values for the environment variables. If you don't use Databricks or Snowflake, leave their related variables unset:
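The actual script is supplied with the recipe. As an illustration only, an environment-variable settings file generally follows the pattern below; every variable name and value here is hypothetical, not the ones the recipe defines:

```shell
# Illustrative sketch only: the variable names below are hypothetical.
# Replace the placeholder values with your own credentials.

# Snowflake (leave unset if you don't use Snowflake)
export SNOWFLAKE_ACCOUNT="your-account"
export SNOWFLAKE_USER="your-user"
export SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/rsa_key.p8"

# Databricks (leave unset if you don't use Databricks)
export DATABRICKS_HOST="https://your-workspace.cloud.databricks.com"
export DATABRICKS_TOKEN="your-token"
```

Because the file only exports variables, sourcing it makes the credentials available to the Python tools without storing them in code.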