Set up change data capture with streams and scheduled tasks
You are a Snowflake data engineer. The user wants to set up change data capture (CDC) with Snowflake Streams to track table changes and automate processing with scheduled Tasks.
What to check first
- Run `SHOW STREAMS;` in your target schema to see existing streams
- Verify the source table exists: `DESCRIBE TABLE <table_name>;`
- Check that your role has the `CREATE STREAM` and `CREATE TASK` privileges: `SHOW GRANTS TO ROLE <role_name>;` (actually running tasks also requires the account-level `EXECUTE TASK` privilege)
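The checks above can be run as one short worksheet. The table and role names here are illustrative placeholders, not part of the original:

```sql
-- List streams already defined in the current schema
SHOW STREAMS;

-- Confirm the source table exists and inspect its columns
DESCRIBE TABLE sales_data;

-- Verify the active role holds CREATE STREAM and CREATE TASK on the schema
SHOW GRANTS TO ROLE my_etl_role;
```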
Steps
- Create a standard stream on your source table with `CREATE STREAM <stream_name> ON TABLE <table_name>;` — this captures INSERT, UPDATE, and DELETE changes
- Query the stream with `SELECT * FROM <stream_name>;` to inspect the metadata columns `METADATA$ACTION`, `METADATA$ISUPDATE`, and `METADATA$ROW_ID`
- Create a target or staging table that will receive processed changes: `CREATE TABLE <target_table> AS SELECT * FROM <source_table> WHERE 1=0;`
- Write a stored procedure that consumes the stream and writes the changes to the target table: `CREATE OR REPLACE PROCEDURE process_changes() RETURNS STRING LANGUAGE SQL AS $$...$$;` Note that a stream cannot be truncated; its offset advances automatically once a DML statement that reads from it commits
- Create a scheduled task: `CREATE TASK <task_name> WAREHOUSE = <warehouse_name> SCHEDULE = '5 MINUTE' AS CALL process_changes();`
- Resume the task with `ALTER TASK <task_name> RESUME;` — tasks are created in a suspended state
- Monitor task execution: `SELECT * FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(TASK_NAME => '<task_name>')) ORDER BY QUERY_START_TIME DESC;`
- Chain dependent tasks with `CREATE TASK child_task ... AFTER parent_task AS ...` (a child task has no SCHEDULE of its own)
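The metadata columns from step 2 can be inspected with a query along these lines; the stream and column names match the code section below, but the exact projection is illustrative:

```sql
-- Peek at pending changes without consuming the stream;
-- a plain SELECT does not advance the stream offset
SELECT
    id,
    amount,
    METADATA$ACTION,    -- 'INSERT' or 'DELETE'
    METADATA$ISUPDATE,  -- TRUE when the row is one half of an UPDATE pair
    METADATA$ROW_ID     -- stable unique identifier for the row
FROM sales_changes;
```

UPDATEs surface as a DELETE/INSERT pair with `METADATA$ISUPDATE = TRUE`, which is why the consuming procedure usually branches on both columns.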
Code
-- 1. Create a stream on the source table
CREATE STREAM sales_changes ON TABLE sales_data
APPEND_ONLY = FALSE;
-- 2. Create a staging table for processed records
CREATE TABLE sales_processed (
id INT,
customer_id INT,
amount DECIMAL(10,2),
sale_date DATE,
dml_type VARCHAR(10),
processed_at TIMESTAMP_NTZ
);
-- 3. Create a stored procedure to consume the stream
CREATE OR REPLACE PROCEDURE consume_sales_stream()
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
change_count INTEGER DEFAULT 0;
BEGIN
-- Insert new records from stream
INSERT INTO sales_processed (id, customer_id, amount, sale_date, dml_type, processed_at)
SELECT
id,
customer_id,
amount,
sale_date,
METADATA$ACTION AS dml_type,
CURRENT_TIMESTAMP() AS processed_at
    FROM sales_changes;

    change_count := SQLROWCOUNT;

    -- Reading the stream inside this committed INSERT consumes it and
    -- advances the stream offset; no explicit reset is needed
    RETURN 'Processed ' || change_count || ' change rows';
END;
$$;
-- Note: the original example was truncated at this point; the lines
-- above are a sketch that completes the procedure.
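The remaining steps (scheduling, resuming, monitoring, and chaining) can be sketched as follows. The task and warehouse names are assumptions; `SYSTEM$STREAM_HAS_DATA` is a built-in function that lets the task skip runs when the stream is empty:

```sql
-- 4. Schedule the procedure; skip runs when the stream has no changes
CREATE TASK process_sales_task
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('sales_changes')
AS
  CALL consume_sales_stream();

-- 5. Tasks are created suspended; resume to start the schedule
ALTER TASK process_sales_task RESUME;

-- 6. Review recent executions
SELECT *
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(TASK_NAME => 'PROCESS_SALES_TASK'))
ORDER BY QUERY_START_TIME DESC;

-- 7. Optional: chain a dependent task (children take no SCHEDULE);
-- the called procedure here is hypothetical
CREATE TASK aggregate_sales_task
  WAREHOUSE = my_wh
  AFTER process_sales_task
AS
  CALL refresh_daily_aggregates();

-- Resume children before the root; suspended children are skipped
ALTER TASK aggregate_sales_task RESUME;
```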
Common Pitfalls
- Treating this skill as a one-shot solution — most workflows need iteration and verification
- Skipping the verification steps — you don't know it worked until you measure
- Applying this skill without understanding the underlying problem — read the related docs first
When NOT to Use This Skill
- When a simpler manual approach would take less than 10 minutes
- On critical production systems without testing in staging first
- When you don't have permission or authorization to make these changes
How to Verify It Worked
- Run the verification steps documented above
- Compare the output against your expected baseline
- Check logs for any warnings or errors — silent failures are the worst kind
Production Considerations
- Test in staging before deploying to production
- Have a rollback plan — every change should be reversible
- Monitor the affected systems for at least 24 hours after the change