Access data files in an Amazon S3 bucket

To load data from Amazon S3 to Snowflake, you need access to the data files in the Amazon S3 bucket.
When you write data to multiple Snowflake targets with storage integration enabled for Amazon S3, you need to use the same Snowflake connection for all targets.
  1. Create a Cloud Storage Integration object that contains the details of the Amazon S3 buckets from which you want to read data.
  2. After you create the Cloud Storage Integration in Snowflake, specify the Cloud Storage Integration name in the Additional JDBC URL Parameters connection property. The Storage Integration value is case-sensitive.
Snowflake Data Cloud Connector creates a temporary external stage that uses the Cloud Storage Integration that you created.
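
As a sketch, a Cloud Storage Integration for Amazon S3 might be created with a statement like the following. The integration name `s3_int`, the role ARN, and the bucket path are placeholder values; substitute your own:

```sql
-- Placeholder example: create a storage integration that references
-- the S3 bucket to read from. Replace the role ARN and bucket path
-- with your own values.
create storage integration s3_int
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::001234567890:role/myrole'
  storage_allowed_locations = ('s3://mybucket/mypath/');
```

You would then reference the integration name, for example `S3_INT`, in the Additional JDBC URL Parameters connection property. Because Snowflake stores unquoted identifiers in uppercase and the Storage Integration value is case-sensitive, the name typically needs to be specified in uppercase.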

Configuring storage integration for Amazon S3

Create a storage integration to allow Snowflake to read data from the Amazon S3 bucket.
    1. Create a Cloud Storage Integration in Snowflake.
    2. Retrieve the Cloud Storage Service account for your Snowflake account.
    3. Grant the service account permissions to access the bucket objects.
       a. Create a custom IAM role.
       b. Assign the custom role to the Cloud Storage Service account.
    4. Grant permissions for the role to create an external stage.
       The role must have the CREATE STAGE privilege on the schema and the USAGE privilege on the storage integration.
       For example, run the following commands to grant these privileges:
       grant create stage on schema public to role myrole;
       grant usage on integration s3_int to role myrole;
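
As a sketch of step 2 above, Snowflake exposes the AWS identity it uses for a storage integration through DESC INTEGRATION. The integration name `s3_int` is the same placeholder used in the grant commands:

```sql
-- Placeholder example: inspect the integration to find the AWS identity
-- that Snowflake uses. Record STORAGE_AWS_IAM_USER_ARN and
-- STORAGE_AWS_EXTERNAL_ID from the output; the custom IAM role that
-- grants access to the bucket objects must trust this identity.
desc integration s3_int;
```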
    For more information about how to configure a storage integration for Amazon S3, see Snowflake storage integration to access Amazon S3 in the Snowflake documentation.