Day 1: Unlocking the Power of Snowpark-Optimized Warehouses

I’m thrilled to see you here on my blog! A big thanks to everyone for your review comments and for connecting with me on LinkedIn. Based on your feedback, I’m launching a new series: 100 Days of Snowflake Tips! Each post will be a quick read — no more than 3 minutes.

Today is Day 1, and I’ll be sharing insights about Snowpark-Optimized Warehouses.

What is a Snowpark-Optimized Warehouse?

When creating a warehouse in Snowflake, you’ll notice an option for Snowpark-Optimized Warehouses.

These warehouses are specifically designed for memory-intensive workloads, offering:

  • 16x the memory per node
  • 10x the local cache

compared to standard warehouses.

This increased memory and caching power unlocks advanced use cases, such as:

  • Machine learning (ML) training and inference
  • Large-scale data exports from object storage
  • Memory-intensive analytics

By choosing a Snowpark-Optimized Warehouse, you can efficiently handle large data sets and complex tasks that might otherwise exceed the capabilities of standard warehouses.

Fig 1 — Snowpark Optimized Warehouse

How Do You Enable a Snowpark-Optimized Warehouse in Snowflake?

When creating a warehouse in Snowflake, you’ll notice two options:

  • Standard
  • Snowpark-Optimized

To enable a Snowpark-Optimized Warehouse, simply select this option during the warehouse creation process. Once selected, you’ll see an additional setting called Resource Constraint (highlighted in red in Fig 1).

What Does the Resource Constraint Do?

Resource constraints allow you to configure the memory and CPU architecture for Snowpark-optimized warehouses, ensuring the warehouse meets specific workload requirements.

As of now, the following resource constraints are available:

  • MEMORY_1X: Up to 16 GB of memory.
  • MEMORY_1X_x86: Up to 16 GB of memory with x86 CPU architecture.
  • MEMORY_16X: Up to 256 GB of memory.
  • MEMORY_16X_x86: Up to 256 GB of memory with x86 CPU architecture.
  • MEMORY_64X: Up to 1 TB of memory.
  • MEMORY_64X_x86: Up to 1 TB of memory with x86 CPU architecture.
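
You don’t have to recreate a warehouse just to change its constraint later. Here’s a minimal sketch using ALTER WAREHOUSE (the warehouse name matches the creation script further down; note that the MEMORY_64X tiers need a LARGE or bigger warehouse, as covered in the next section):

-- Bump an existing Snowpark-optimized warehouse to the top memory tier.
-- MEMORY_64X_x86 requires a size of LARGE or above, so resize and
-- change the constraint in the same statement.
ALTER WAREHOUSE TEST_SNOWPARK_WAREHOUSE SET
  WAREHOUSE_SIZE = 'LARGE'
  RESOURCE_CONSTRAINT = 'MEMORY_64X_x86';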

Key Considerations

  • XSMALL and SMALL Snowpark-optimized warehouses can select only MEMORY_1X and MEMORY_1X_x86 as resource constraints.
  • MEDIUM Snowpark-optimized warehouses support MEMORY_1X, MEMORY_1X_x86, MEMORY_16X, and MEMORY_16X_x86.

From LARGE upwards, you can select any resource constraint. By choosing the right one, you can optimize your warehouse for specific use cases such as memory-intensive ML tasks or large-scale data processing.
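
Not sure what an existing warehouse is set to? A quick check (the name pattern is just an example; in my environment the SHOW output includes type, size, and resource_constraint columns, but verify the column names in your account):

-- Inspect the "type", "size", and "resource_constraint" columns.
SHOW WAREHOUSES LIKE 'TEST_SNOWPARK%';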

Script to create a Snowpark-Optimized Warehouse:

CREATE OR REPLACE WAREHOUSE TEST_SNOWPARK_WAREHOUSE
WAREHOUSE_TYPE = 'SNOWPARK-OPTIMIZED' -- Ensures optimization for Snowpark workloads.
WAREHOUSE_SIZE = 'MEDIUM' -- Allocates resources suitable for moderate workloads (default is 'XSMALL').
RESOURCE_CONSTRAINT = 'MEMORY_16X_x86' -- Requests up to 256 GB of memory on x86 architecture for memory-intensive tasks.
MAX_CONCURRENCY_LEVEL = 1 -- Limits concurrency for optimal CPU and RAM utilization (default is 8).
AUTO_RESUME = TRUE -- Automatically resumes the warehouse when queried.
AUTO_SUSPEND = 60 -- Suspends the warehouse after 60 seconds of inactivity (default is 10 minutes).
INITIALLY_SUSPENDED = TRUE -- Starts the warehouse in a suspended state (default is active).
STATEMENT_TIMEOUT_IN_SECONDS = 240; -- Limits query execution time to 240 seconds (default is 2 days).
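
Once the warehouse exists, using it is a one-liner. A small usage sketch (MY_LARGE_TABLE is a placeholder table name):

USE WAREHOUSE TEST_SNOWPARK_WAREHOUSE; -- Route this session's queries to the new warehouse.
SELECT COUNT(*) FROM MY_LARGE_TABLE; -- AUTO_RESUME = TRUE wakes the suspended warehouse on first use.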

Stay tuned for more Snowflake insights!

About me:

I am a Cloud Data Architect at EY New Zealand. Over the course of my career, I have led and contributed to numerous projects, including legacy data warehouse modernization, big data implementations, cloud platform integrations, and migration initiatives. If you require assistance with certification, data solutions, or implementations, please feel free to connect with me on LinkedIn.
