
Snowflake targets

Target preparation

Target preparation varies depending on whether you use the Superpipe feature for high-performance streaming of data to Snowflake target tables or write data to intermediate stage files.
With Superpipe
If you use the Superpipe feature, complete the following steps:
  1. Create a Data Ingestion and Replication user. Use the following SQL statement:
     create user INFACMI_User password 'Xxxx@xxx';
  2. Create a new user role and grant it to the Data Ingestion and Replication user. Use the following SQL statements:
     create role INFACMI_superpipe;
     grant role INFACMI_superpipe to user INFACMI_User;
  3. Grant usage on the Snowflake virtual warehouse to the new role. Use the following SQL statement:
     grant usage on warehouse warehouse_name to role INFACMI_superpipe;
  4. Grant usage on the Snowflake database to the new role. Use the following SQL statement:
     grant usage on database INFACMI_DB1 to role INFACMI_superpipe;
  5. Create a new schema. Use the following SQL statements:
     use database INFACMI_DB1;
     create schema sh_superpipe;
  6. Grant the create stream, create view, and create table privileges on the new Snowflake schema to the new role. Use the following SQL statement:
     grant create stream, create view, create table, usage on schema INFACMI_DB1.sh_superpipe to role INFACMI_superpipe;
  7. Set the default role for the newly created user. Use the following SQL statement:
     alter user INFACMI_User set default_role=INFACMI_superpipe;
  8. Define a Snowflake Data Cloud connection to the target. You must use the KeyPair option as the authentication method. See Connectors and Connections > Snowflake Data Cloud connection properties.
  9. Generate a private key with OpenSSL version 3.x.x and a supported cipher. Use the following openssl command to generate and format the private key:
     openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -v1 AES-256-CBC -out rsa_key.p8
  10. Generate the public key. Use the following openssl command, in which the -in option references the file (rsa_key.p8) that contains the encrypted private key:
     openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
  11. In Snowflake, assign the public key to the Snowflake user. Use the following SQL statement:
     alter user INFACMI_User set rsa_public_key='key_value';
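After assigning the public key, you can confirm that Snowflake registered it correctly by comparing fingerprints. The sketch below generates a throwaway key pair and computes its SHA-256 fingerprint locally; compare the output with the RSA_PUBLIC_KEY_FP value that DESCRIBE USER INFACMI_User reports. The file names are placeholders, and the key here is unencrypted for brevity:

```shell
# Generate a throwaway RSA key pair (illustration only; in practice use the
# encrypted private key generated earlier in this procedure).
openssl genrsa -out rsa_key.pem 2048 2>/dev/null
openssl rsa -in rsa_key.pem -pubout -out rsa_key.pub 2>/dev/null

# Compute the SHA-256 fingerprint of the public key, in the same form that
# Snowflake reports in the RSA_PUBLIC_KEY_FP column of DESCRIBE USER output.
FP=$(openssl rsa -pubin -in rsa_key.pub -outform DER 2>/dev/null \
      | openssl dgst -sha256 -binary | openssl enc -base64)
echo "SHA256:$FP"
```

If the local fingerprint does not match the value Snowflake reports, the wrong key was assigned to the user.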
Next step: When you create an ingestion and replication task, select the Superpipe option on the Target page of the task wizard. You can also optionally specify a Merge Frequency value, which controls the frequency at which change data rows are merged and applied to the Snowflake target table.
Without Superpipe
If you do not use the Superpipe feature for Snowflake targets, complete the following steps as the ACCOUNTADMIN user:
  1. Create a Data Ingestion and Replication user. Use one of the following SQL statements:
     create user INFACMI_User password 'Xxxx@xxx';
     or
     create or replace user INFACMI_User password 'Xxxx@xxx';
  2. Create a new role and grant the role to the Data Ingestion and Replication user. Use the following SQL statements:
     create role INFA_CMI_Role;
     grant role INFA_CMI_Role to user INFACMI_User;
  3. Grant usage on the Snowflake virtual warehouse to the new role. Use the following SQL statement:
     grant usage on warehouse CMIWH to role INFA_CMI_Role;
  4. Grant the usage and create schema privileges on the Snowflake database to the new role. Use the following SQL statement:
     grant usage, create schema on database CMIDB to role INFA_CMI_Role;
  5. Set the default role for the newly created user. Use the following SQL statement:
     alter user INFACMI_User set default_role=INFA_CMI_Role;
Also, as the INFACMI_User, create a new schema:
create schema CMISchema;
Note: If the user's default role is used for ingestion and replication tasks and does not have the required privileges, the following error will be issued at runtime:
SQL compilation error: Object does not exist, or operation cannot be performed.
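If you encounter this error, you can inspect the user's default role and the role's privileges with standard Snowflake commands. The statements below are illustrative and assume the role and database names used in the steps above:

```sql
-- Confirm the default role assigned to the ingestion user.
describe user INFACMI_User;

-- List the privileges granted to the role that the tasks run under.
show grants to role INFA_CMI_Role;

-- If a required privilege is missing, grant it. For example:
grant usage, create schema on database CMIDB to role INFA_CMI_Role;
```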
For Snowflake-managed Iceberg tables
Regardless of whether you use the Superpipe option, you can choose to store data and metadata in Snowflake-managed Iceberg tables in external cloud storage instead of in Snowflake tables. To prepare for the use of Iceberg tables, perform the following tasks:

Usage considerations

Using Snowflake-managed Iceberg tables to store data in external cloud storage

You can enable database ingestion and replication jobs to store data and metadata in Snowflake-managed Iceberg tables in an external cloud location instead of in Snowflake tables. The external location can be in Amazon S3, Google Cloud Storage, or Azure Storage.
The external storage of data in an open format has the following benefits:
Database Ingestion and Replication automatically creates the Snowflake-managed Iceberg tables when you deploy ingestion and replication tasks enabled for Iceberg table use. You do not need to create the Iceberg tables manually.
Snowflake manages the metadata and catalog for the Iceberg tables. Snowflake uses an external volume that stores identity and access management (IAM) information to connect to the external storage location. In Snowflake, you must create and configure an external volume beforehand. For more information, see "Target preparation" in Snowflake targets.
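For illustration, an external volume for Amazon S3 might be created as in the following sketch. The volume name, bucket path, and IAM role ARN are placeholders; substitute values for your environment:

```sql
-- Create an external volume that Snowflake-managed Iceberg tables can use
-- to store data and metadata in an S3 bucket. The IAM role must allow
-- Snowflake to read from and write to the storage location.
create or replace external volume iceberg_ext_vol
  storage_locations =
    (
      (
        name = 'my-s3-location'
        storage_provider = 'S3'
        storage_base_url = 's3://my-bucket/iceberg/'
        storage_aws_role_arn = 'arn:aws:iam::123456789012:role/my-snowflake-role'
      )
    );
```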
When you configure ingestion and replication tasks, specify the following target custom properties to enable the use of Snowflake-managed Iceberg tables:
The ingestion and replication tasks can use any load type, any apply mode, and schema drift. The Superpipe option is allowed but not required.
The following limitations apply to using Iceberg tables:

Configure private connectivity to Snowflake

You can access Snowflake using AWS or Azure Private Link endpoints.
The AWS or Azure Private Link setup ensures that the connection to Snowflake uses the AWS or Azure internal network and does not take place over the public Internet.
To connect to the Snowflake account over the private AWS network, see AWS Private Link and Snowflake.
To connect to the Snowflake account over the private Azure network, see Azure Private Link and Snowflake.
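As a sanity check after the private link setup, you can retrieve the account's private connectivity configuration from within Snowflake. The system function below is part of Snowflake; the returned JSON (private link URLs and host names) varies by account:

```sql
-- Returns a JSON document describing the private connectivity endpoints
-- configured for this Snowflake account.
select system$get_privatelink_config();
```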