Snowflake

Snowflake is an analytic data warehouse provided as Software-as-a-Service (SaaS). Snowflake provides a data warehouse that is faster, easier to use, and far more flexible than the traditional data warehouse offerings. The Snowflake data warehouse uses a SQL database engine with a unique architecture designed for the Cloud.

Objects Extracted

The Snowflake resource extracts metadata from the following assets in a Snowflake data source:
The following Snowflake objects are not extracted by the Snowflake resource:

Connect to a Snowflake Data Source Enabled for SSL

To connect to a Snowflake data source enabled for SSL, perform the following steps:
  1. Download the following Snowflake SSL certificates using a web browser:
     Note: Make sure that you import the Snowflake Trust Services certificate in the Certificates directory.
  2. Download the certificates from the Cloud platform on which the Snowflake account is hosted. The Snowflake account is hosted on one of the following Cloud platforms:
  3. Copy the certificates to the <INFA_HOME>/services/shared/security/ directory.
  4. Go to the <INFA_HOME>/source/java/jre/bin directory and run the following keytool command to import each copied certificate as a trusted certificate in the Informatica domain keystore:
     keytool -import -file <INFA_HOME>/services/shared/security/<certificate>.cer -alias <alias name> -keystore <INFA_HOME>/services/shared/security/infa_truststore.jks -storepass <Informatica domain keystore password>
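When you import several certificates, you run the same keytool command once per file with a distinct alias. The following sketch builds that command for each certificate; the `<INFA_HOME>` path, certificate file name, alias, and keystore password shown are illustrative placeholders, not values from this guide.

```python
# Sketch: build the keytool import command for each downloaded certificate.
# All paths, aliases, and the keystore password below are placeholders;
# substitute your own <INFA_HOME> location and domain keystore password.
from pathlib import Path

def build_keytool_cmd(cert: Path, alias: str, infa_home: str, storepass: str) -> list[str]:
    """Return the keytool argument list that imports one certificate
    into the Informatica domain truststore (infa_truststore.jks)."""
    keystore = f"{infa_home}/services/shared/security/infa_truststore.jks"
    return [
        f"{infa_home}/source/java/jre/bin/keytool",
        "-import",
        "-file", str(cert),
        "-alias", alias,
        "-keystore", keystore,
        "-storepass", storepass,
    ]

# Example: one command per certificate, using the file stem as the alias.
infa_home = "/opt/informatica"  # placeholder for <INFA_HOME>
for cert in [Path("/opt/informatica/services/shared/security/snowflake.cer")]:
    print(" ".join(build_keytool_cmd(cert, cert.stem, infa_home, "changeit")))
```

Using one alias per certificate keeps each entry individually replaceable in the truststore later.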

Resource Connection Properties

The General tab includes the following properties:
- Account. Name of the Snowflake account.
- Additional parameters. Specify one or more JDBC connection parameters in the following format:
  <param1>=<value>&<param2>=<value>&<param3>=<value>
  For example: user=jon&warehouse=mywh&db=mydb&schema=public
- Password. Password associated with the Snowflake user account.
- Role. The Snowflake role assigned to the user.
- User. The user name to connect to the Snowflake account.
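The Additional parameters value is a single string of key=value pairs joined with ampersands. The following sketch assembles that string from a mapping; the parameter names mirror the guide's example, and the helper name is ours, not part of the product.

```python
# Sketch: compose the "Additional parameters" string from a mapping.
# Format per the guide: <param1>=<value>&<param2>=<value>&<param3>=<value>
def build_additional_params(params: dict[str, str]) -> str:
    return "&".join(f"{k}={v}" for k, v in params.items())

params = {"user": "jon", "warehouse": "mywh", "db": "mydb", "schema": "public"}
print(build_additional_params(params))
# user=jon&warehouse=mywh&db=mydb&schema=public
```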
The following table describes the properties that you can configure in the Source Metadata section of the Metadata Load Settings tab:
- Enable Source Metadata. Enables metadata extraction.
- Case Sensitive. Specifies whether the resource is configured as case sensitive. Select one of the following values:
  - True. Select the check box to specify that the resource is configured as case sensitive.
  - False. Clear the check box to specify that the resource is configured as case insensitive.
  The default value is True.
- Schema. Select the database schemas on which you want to apply the constraint.
- Memory. The memory value required to run a scanner job. Specify one of the following memory values:
  - Low
  - Medium
  - High
  Note: For more information about the memory values, see the Tuning Enterprise Data Catalog Performance article on the How-To Library Articles tab in the Informatica Documentation Portal.
- JVM Options. JVM parameters that you can set to configure the scanner container. Use the following arguments to configure the parameters:
  - -Dscannerloglevel=<DEBUG/INFO/ERROR>. Changes the scanner log level to DEBUG, INFO, or ERROR. The default value is INFO.
  - -Dscanner.container.core=<No. of core>. Increases the number of cores for the scanner container. The value must be a number.
  - -Dscanner.yarn.app.environment=<key=value>. Key-value pair to set in the YARN environment. Use a comma to separate multiple key-value pairs.
  - -Dscanner.pmem.enabled.container.memory.jvm.memory.ratio=<1.0/2.0>. Increases the scanner container memory when pmem is enabled. The default value is 1.
- Track Data Source Changes. View metadata source change notifications in Enterprise Data Catalog.
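The JVM Options value is a space-separated list of -D settings. The following sketch composes that list from individual option names and values; the option names come from this guide, while the chosen values and the helper function are illustrative only.

```python
# Sketch: assemble the scanner JVM Options value from individual -D settings.
# Option names (scannerloglevel, scanner.container.core) are from the guide;
# the values chosen here are examples, not recommendations.
def build_jvm_options(options: dict[str, str]) -> str:
    return " ".join(f"-D{name}={value}" for name, value in options.items())

opts = {
    "scannerloglevel": "DEBUG",      # raise logging for troubleshooting
    "scanner.container.core": "4",   # request 4 cores for the container
}
print(build_jvm_options(opts))
# -Dscannerloglevel=DEBUG -Dscanner.container.core=4
```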
You can enable data discovery for a Snowflake resource. For more information, see the Enable Data Discovery topic.
You can enable composite data domain discovery for a Snowflake resource. For more information, see the Composite Data Domain Discovery topic.