Cheatsheet - Snowflake
1. Warehouse Management (Compute)
Virtual Warehouses provide the compute resources for executing queries.
Command | Description |
---|---|
CREATE WAREHOUSE my_wh; | Creates a new warehouse at the default X-Small size. |
CREATE WAREHOUSE my_wh WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 600 AUTO_RESUME = TRUE; | Creates a configured warehouse that suspends after 10 minutes of inactivity and resumes automatically on demand. |
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'X-LARGE'; | Resizes a warehouse; the new size applies to queued and new queries, not ones already running. |
ALTER WAREHOUSE my_wh SUSPEND; | Suspends a warehouse, stopping credit usage. |
ALTER WAREHOUSE my_wh RESUME; | Resumes a suspended warehouse. |
DROP WAREHOUSE my_wh; | Deletes a warehouse. |
SHOW WAREHOUSES; | Lists all warehouses you have access to. |
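
For spiky concurrency, a warehouse can also scale out rather than up. A minimal sketch of a multi-cluster configuration (multi-cluster warehouses require Enterprise Edition or higher; the name my_mc_wh is illustrative):

```sql
-- Scale out to absorb query concurrency, not just query size
CREATE WAREHOUSE my_mc_wh WITH
    WAREHOUSE_SIZE = 'MEDIUM'
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 3
    SCALING_POLICY = 'STANDARD'  -- or 'ECONOMY' to favor credits over latency
    AUTO_SUSPEND = 300
    AUTO_RESUME = TRUE;
```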
2. Database, Schema & Table DDL (Storage)
These commands define the structure for storing your data.
- Databases & Schemas:

```sql
CREATE DATABASE my_db;
CREATE SCHEMA my_schema;
DROP DATABASE my_db;
DROP SCHEMA my_schema;
SHOW DATABASES;
SHOW SCHEMAS;
```
- Tables:
- Create a Standard Table:

```sql
CREATE TABLE my_table (
    id INT,
    name VARCHAR(100),
    created_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);
```

- Create Table As Select (CTAS):

```sql
CREATE TABLE new_table AS SELECT * FROM old_table;
```

- Temporary Table (session-specific; dropped automatically when the session ends):

```sql
CREATE TEMPORARY TABLE my_temp_table (id INT);
```

- Transient Table (persists, but has no Fail-safe and limited Time Travel):

```sql
CREATE TRANSIENT TABLE my_transient_table (id INT);
```

- Alter & Drop:

```sql
ALTER TABLE my_table ADD COLUMN email VARCHAR;
ALTER TABLE my_table RENAME TO my_new_table_name;
DROP TABLE my_table;
```
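
Two DDL modifiers worth keeping at hand alongside the forms above, sketched here against the same illustrative table:

```sql
-- Recreate from scratch, silently dropping any existing table of the same name
CREATE OR REPLACE TABLE my_table (id INT, name VARCHAR(100));

-- Create only if absent; a no-op when the table already exists
CREATE TABLE IF NOT EXISTS my_table (id INT, name VARCHAR(100));
```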
3. Data Manipulation Language (DML)
Commands for adding, modifying, and removing data.
- Insert:

```sql
INSERT INTO my_table (id, name) VALUES (1, 'Alice');

-- Multi-row insert
INSERT INTO my_table (id, name) VALUES (2, 'Bob'), (3, 'Charlie');
```

- Update:

```sql
UPDATE my_table SET name = 'Alicia' WHERE id = 1;
```

- Delete:

```sql
DELETE FROM my_table WHERE id = 3;
```

- Merge (Upsert):

```sql
MERGE INTO target t
USING source s
    ON t.id = s.id
WHEN MATCHED THEN
    UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN
    INSERT (id, name) VALUES (s.id, s.name);
```
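
MERGE can also delete matched rows in the same pass. A minimal sketch, assuming a hypothetical is_deleted flag on the source table:

```sql
-- Upsert plus conditional delete in one MERGE
-- (is_deleted is an assumed BOOLEAN column on source)
MERGE INTO target t
USING source s
    ON t.id = s.id
WHEN MATCHED AND s.is_deleted THEN
    DELETE
WHEN MATCHED THEN
    UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN
    INSERT (id, name) VALUES (s.id, s.name);
```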
4. Querying Data (DQL)
- Basic Select:

```sql
SELECT id, name
FROM my_table
WHERE id > 1
ORDER BY name DESC
LIMIT 10;
```
- Sampling Data (Quick approximate results):

```sql
-- Get a sample of approx. 10% of the rows
SELECT * FROM large_table SAMPLE ROW (10);

-- Get a sample of exactly 1000 rows
SELECT * FROM large_table SAMPLE (1000 ROWS);
```
- QUALIFY (Filter on window function results):

```sql
-- Get the most recent record for each user
SELECT *
FROM user_logins
QUALIFY ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY login_time DESC) = 1;
```
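
For comparison, the same filter without QUALIFY needs a derived table, because window functions cannot appear in WHERE. A sketch of the equivalent (EXCLUDE is Snowflake's column-exclusion syntax):

```sql
-- Compute the row number in a subquery, then filter on it
SELECT * EXCLUDE rn
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY login_time DESC) AS rn
    FROM user_logins
)
WHERE rn = 1;
```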
5. Semi-Structured Data (JSON)
Snowflake excels at handling JSON directly in tables.
- Prerequisites: Load JSON data into a single VARIANT column.

```sql
CREATE TABLE json_data (raw_json VARIANT);
INSERT INTO json_data
    SELECT PARSE_JSON('{"name":"Alice", "city":"New York", "tags":["dev", "sql"]}');
```
- Querying JSON:

```sql
-- Dot notation to access elements
SELECT raw_json:name, raw_json:city FROM json_data;

-- Accessing array elements
SELECT raw_json:tags[0] FROM json_data;

-- Casting values
SELECT raw_json:name::STRING AS user_name FROM json_data;
```
- FLATTEN (Unnesting arrays into rows):

```sql
SELECT raw_json:name, f.value::STRING AS tag
FROM json_data,
     LATERAL FLATTEN(input => raw_json:tags) f;
```
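
By default FLATTEN drops rows whose array is empty or missing. A sketch using the optional OUTER argument to keep them (the tag comes back as NULL for such rows):

```sql
-- Keep rows even when raw_json:tags is empty or absent
SELECT raw_json:name, f.value::STRING AS tag
FROM json_data,
     LATERAL FLATTEN(input => raw_json:tags, outer => TRUE) f;
```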
6. Time Travel & Cloning (Zero-Copy)
- Time Travel (Querying past data):

```sql
-- At a specific timestamp
SELECT * FROM my_table AT(TIMESTAMP => '2023-10-27 10:00:00'::TIMESTAMP_LTZ);

-- Before a specific time
SELECT * FROM my_table BEFORE(TIMESTAMP => '2023-10-27 10:00:00'::TIMESTAMP_LTZ);

-- Before a specific query ID
SELECT * FROM my_table BEFORE(STATEMENT => 'query_id');

-- Offset from now: 5 minutes ago
SELECT * FROM my_table AT(OFFSET => -60*5);
```
- Undrop a Table:

```sql
UNDROP TABLE my_table;
```

- Zero-Copy Cloning (Create instant metadata copies):

```sql
-- Create a new dev database identical to prod, without duplicating data
CREATE DATABASE my_db_dev CLONE my_db_prod;

-- Clone a table to a specific point in time for analysis
CREATE TABLE my_table_backup CLONE my_table AT(OFFSET => -3600); -- 1 hour ago
```
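
Time Travel only reaches as far back as the table's retention window. A sketch of checking and extending it (retention above 1 day requires Enterprise Edition or higher):

```sql
-- Show the current retention window (in days)
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE my_table;

-- Extend it to 7 days
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 7;
```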
7. Data Loading & Unloading
- Stages (Locations for data files):

```sql
-- Create an S3 stage
CREATE STAGE my_s3_stage
    URL = 's3://my-bucket/path/'
    CREDENTIALS = (AWS_KEY_ID = 'xxx' AWS_SECRET_KEY = 'yyy');

-- List files in a stage
LIST @my_s3_stage;
```
- Loading Data from a Stage:

```sql
COPY INTO my_table
FROM @my_s3_stage/data.csv
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);
```
- Unloading Data to a Stage:

```sql
COPY INTO @my_s3_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = PARQUET);
```
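
Inline file formats get repetitive across COPY statements. A sketch of a reusable named format (the name my_csv_format is illustrative):

```sql
-- Define the format once...
CREATE FILE FORMAT my_csv_format
    TYPE = CSV
    FIELD_OPTIONALLY_ENCLOSED_BY = '"'
    SKIP_HEADER = 1;

-- ...then reference it by name in any COPY
COPY INTO my_table
FROM @my_s3_stage/data.csv
FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```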
8. Access Control (Users & Roles)
Command | Description |
---|---|
CREATE ROLE analyst; | Creates a new role. |
GRANT ROLE analyst TO USER jdoe; | Assigns a role to a user. |
GRANT USAGE ON DATABASE my_db TO ROLE analyst; | Allows the role to "see" the database. |
GRANT USAGE ON SCHEMA my_schema TO ROLE analyst; | Allows the role to "see" the schema. |
GRANT SELECT ON TABLE my_table TO ROLE analyst; | Allows the role to run SELECT on a table. |
GRANT USAGE ON WAREHOUSE my_wh TO ROLE analyst; | Allows the role to use a warehouse. |
REVOKE SELECT ON TABLE my_table FROM ROLE analyst; | Removes a specific privilege. |
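
Grants on existing objects do not cover tables created later. A sketch using future grants, plus the conventional step of rolling custom roles up to SYSADMIN:

```sql
-- Cover tables that don't exist yet in the schema
GRANT SELECT ON FUTURE TABLES IN SCHEMA my_db.my_schema TO ROLE analyst;

-- Role-hierarchy hygiene: let SYSADMIN administer objects owned by analyst
GRANT ROLE analyst TO ROLE SYSADMIN;
```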
9. Session & Context
Command | Description |
---|---|
USE ROLE SYSADMIN; | Switches to a different role. |
USE WAREHOUSE my_wh; | Sets the active warehouse for the session. |
USE DATABASE my_db; | Sets the active database. |
USE SCHEMA my_schema; | Sets the active schema. |
SELECT CURRENT_USER(), CURRENT_ROLE(), CURRENT_WAREHOUSE(); | Gets current context information. |
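
Pulled together, a typical start-of-session preamble might look like this (names are illustrative):

```sql
USE ROLE analyst;
USE WAREHOUSE my_wh;
USE SCHEMA my_db.my_schema;  -- sets both database and schema in one step
SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_SCHEMA();
```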