
Industry
E-commerce
Skills
Approach
Tools
Snowflake
Learning Objectives
- Capture real-time DML operations using Snowflake Streams for auditing and change data capture (see the sketch after this list).
- Automate the movement and transformation of incremental data using Snowflake Tasks.
- Develop an end-to-end pipeline from a source table to a processed target using only SQL scripting.
- Understand the practical use of Streams and Tasks in modern data engineering scenarios.
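As a minimal sketch of the first objective, the following shows a Stream recording DML changes. The table and stream names (customers, customers_stream) and the columns are illustrative assumptions, not names fixed by the project.

```sql
-- Illustrative source table; the real GlobalMart schema may differ.
CREATE OR REPLACE TABLE customers (
    customer_id   NUMBER,
    customer_name VARCHAR,
    email         VARCHAR
);

-- The stream records every INSERT, UPDATE, and DELETE on customers.
CREATE OR REPLACE STREAM customers_stream ON TABLE customers;

-- Simulate some DML so the stream has changes to expose.
INSERT INTO customers VALUES (1, 'Ada', 'ada@example.com');
UPDATE customers SET email = 'ada@globalmart.com' WHERE customer_id = 1;

-- A plain SELECT previews pending changes without consuming them;
-- an update surfaces as a DELETE/INSERT pair with METADATA$ISUPDATE = TRUE.
SELECT customer_id, email, METADATA$ACTION, METADATA$ISUPDATE
FROM customers_stream;
```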
Overview
In this project, you will implement a Stream that captures the data changes from each DML operation (INSERT, UPDATE, DELETE) performed on GlobalMart's customers table. You will then automate the entire pipeline with Snowflake Tasks so that changes are consistently processed and stored for downstream consumption. This hands-on project simulates a real-time change-tracking scenario used in modern data architectures, enabling learners to build near-real-time ETL workflows. It also offers an opportunity to understand how Snowflake's native features, Streams and Tasks, work together to support automation, change data capture (CDC), and incremental data processing.
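The overview maps onto a Task that drains the Stream on a schedule. The following is a hedged sketch rather than the project's reference solution: the history-table schema, warehouse name (compute_wh), and one-minute schedule are all assumptions.

```sql
-- Assumed target for processed changes; columns are illustrative.
CREATE OR REPLACE TABLE customers_history (
    customer_id   NUMBER,
    customer_name VARCHAR,
    email         VARCHAR,
    dml_action    VARCHAR,        -- METADATA$ACTION: 'INSERT' or 'DELETE'
    is_update     BOOLEAN,        -- METADATA$ISUPDATE
    captured_at   TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- The task wakes every minute but only runs when the stream
-- actually holds unconsumed changes.
CREATE OR REPLACE TASK process_customer_changes
    WAREHOUSE = compute_wh        -- assumed warehouse name
    SCHEDULE  = '1 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('CUSTOMERS_STREAM')
AS
    INSERT INTO customers_history
        (customer_id, customer_name, email, dml_action, is_update)
    SELECT customer_id, customer_name, email,
           METADATA$ACTION, METADATA$ISUPDATE
    FROM customers_stream;

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK process_customer_changes RESUME;
```

Because the task's INSERT reads the stream inside a DML statement, the stream's offset advances on each successful run, so the same changes are not reprocessed.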
Prerequisites
- Understanding of Snowflake tables and their types (permanent, transient, temporary).
- Knowledge of how to create and load data using stages in Snowflake.
- Familiarity with setting up storage integrations for accessing external cloud storage (a short refresher sketch follows this list).
- Basic to intermediate understanding of SQL scripting and procedural logic in Snowflake.
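As a short refresher on the stage and storage integration prerequisites, here is a sketch assuming a hypothetical S3 bucket, IAM role, and CSV layout; substitute your own cloud locations and file format.

```sql
-- Hypothetical integration; the role ARN and bucket are placeholders.
CREATE OR REPLACE STORAGE INTEGRATION s3_int
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_load_role'
    STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/customers/');

-- External stage backed by the integration.
CREATE OR REPLACE STAGE customer_stage
    URL = 's3://my-bucket/customers/'
    STORAGE_INTEGRATION = s3_int
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk-load the staged files into the source table.
COPY INTO customers
FROM @customer_stage;
```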