Introduction to SQL ELT

You can enhance mapping performance with SQL ELT.
You can read data from a cloud data warehouse and write it to the same cloud data warehouse. You can also read data from a data lake in your cloud ecosystem and write it to a cloud data warehouse in the same ecosystem. Data Integration translates the transformation logic into ecosystem-specific commands and SQL statements that run in the underlying cloud infrastructure. This increases data processing speed because the data is not moved out of the cloud infrastructure for processing.
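
For example, when both the source and target are Snowflake tables, the mapping logic can run as a single SQL statement inside the warehouse. The following is a minimal sketch of the kind of statement that might be generated for a mapping that filters and aggregates data; the table and column names are hypothetical:

    -- Filter and aggregate entirely inside Snowflake; no data leaves the warehouse.
    INSERT INTO SALES_SUMMARY (REGION, TOTAL_AMOUNT)
    SELECT REGION, SUM(AMOUNT)
    FROM SALES_RAW
    WHERE ORDER_DATE >= '2023-01-01'
    GROUP BY REGION;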

Example

You work for a healthcare solutions organization that provides healthcare technology to pharmacies and pharmacy chains. You enable pharmacies to process prescriptions, store and provide access to healthcare records, and improve patient outcomes. Your organization stores its data in Amazon S3.
Management wants to create a patient-centric pharmacy management system. The organization plans to leverage the warehouse infrastructure of Snowflake and load all of its data to Snowflake so that it can make operational, financial, and clinical decisions with ease.
To load data from an Amazon S3 object to Snowflake, use SQL ELT with the transformations that the data warehouse model requires. Use an Amazon S3 V2 connection to read data from the Amazon S3 source and a Snowflake Data Cloud connection to write to a Snowflake target. Data Integration uploads the Amazon S3 source data to the Snowflake stage with the PUT command and then loads it with Snowflake COPY commands, converting the transformations to the corresponding SQL functions and expressions while loading the data to Snowflake. You can enhance the performance of the task and reduce the costs involved by configuring SQL ELT mappings.
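
The following is a minimal sketch of this staging-and-load pattern, assuming a hypothetical named stage (PHARMACY_STAGE) and target table (PATIENTS); the actual commands that Data Integration generates depend on the mapping:

    -- Upload the source file to a Snowflake internal stage (hypothetical names).
    PUT file:///tmp/patients.csv @PHARMACY_STAGE;

    -- Load the staged data; the mapping transformations become SQL expressions
    -- such as UPPER and TO_DATE inside the COPY statement.
    COPY INTO PATIENTS (PATIENT_ID, FULL_NAME, LAST_VISIT)
    FROM (
        SELECT $1, UPPER($2), TO_DATE($3, 'YYYY-MM-DD')
        FROM @PHARMACY_STAGE
    )
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);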