
Create a Snowflake Connection in Airflow

Snowflake is a cloud data warehouse where you can store and analyze your data. Integrating Snowflake with Airflow allows you to do all of the following and more from a DAG:

  • Run SQL
  • Monitor the status of SQL queries
  • Run a Snowpark Python function
  • Load and export data to/from Snowflake

This guide provides the basic setup for creating a Snowflake connection. For a complete integration tutorial, see Orchestrate Snowflake Queries with Airflow. To run Snowpark queries in Airflow, see Run Snowpark queries with the ExternalPythonOperator in Apache Airflow.

Prerequisites

  • The Astro CLI and a locally running Astro project
  • A Snowflake account

Get connection details

A connection from Airflow to Snowflake requires the following information:

  • Host
  • Account
  • Region
  • Role
  • Database
  • Warehouse
  • Username
  • Password

Complete the following steps to retrieve these values:

  1. Open Snowsight. Follow the Snowflake documentation to open the account selector at the end of the left nav. Hover over your account to see more details, then click the Copy URL icon to copy the account URL. The URL has a format similar to https://<account-identifier>.<region>.snowflakecomputing.com/. Copy <account-identifier> and <region> from the URL.

    Screenshot of the bottom of the left nav in Snowsight showing how to copy the account URL.

info

When you copy your region, you might also need to include the cloud provider identifier after the region name for some GCP and AWS regions. For example, if your account URL is https://ZS86751.europe-west4.gcp.snowflakecomputing.com, then your region is europe-west4.gcp. See Account identifiers to learn more about Snowflake's account types and their identifiers.

  2. Click the user menu at the beginning of the left sidebar and copy the role you want Airflow to have in Snowflake. You can click Switch Role to see all the available roles.

    Screenshot of the user menu in Snowsight showing how to copy the role.

  3. Copy the name of your Warehouse. To see all available warehouses, open a new Worksheet and open the context selector menu in the content pane.

    Screenshot of the context selector menu in Snowsight showing how to copy the warehouse.

  4. Copy the names of your Database and Schema.
  5. Create a new user that Airflow can use to access Snowflake. Copy the username and password.
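If you script your setup, the account URL copied in step 1 can also be split programmatically. The following is a minimal sketch in plain Python, assuming the URL follows the https://<account-identifier>.<region>.snowflakecomputing.com/ pattern shown above; the example URL is the hypothetical one from the info note:

```python
from urllib.parse import urlparse

def parse_snowflake_url(url: str) -> tuple[str, str]:
    """Split a Snowflake account URL into (account_identifier, region).

    Assumes the pattern https://<account-identifier>.<region>.snowflakecomputing.com/,
    where <region> may carry a cloud suffix such as europe-west4.gcp.
    """
    # netloc preserves the original case of the account identifier
    host = urlparse(url).netloc
    parts = host.split(".")
    if parts[-2:] != ["snowflakecomputing", "com"]:
        raise ValueError(f"not a Snowflake account URL: {url}")
    account_identifier = parts[0]
    region = ".".join(parts[1:-2])  # keeps a cloud suffix like '.gcp' if present
    return account_identifier, region

print(parse_snowflake_url("https://ZS86751.europe-west4.gcp.snowflakecomputing.com"))
# → ('ZS86751', 'europe-west4.gcp')
```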

Create your connection

  1. Open your Astro project and add the following line to your requirements.txt file:

    apache-airflow-providers-snowflake

    This will install the Snowflake provider package, which makes the Snowflake connection type available in Airflow.

  2. Run astro dev restart to restart your local Airflow environment and apply your changes in requirements.txt.

  3. In the Airflow UI for your local Airflow environment, go to Admin > Connections. Click + to add a new connection, then select Snowflake as the connection type.

  4. Fill out the following connection fields using the information you retrieved from Get connection details:

    • Connection Id: Enter a name for the connection.
    • Schema: Enter your Schema.
    • Login: Enter your username.
    • Password: Enter your password.
    • Account: Enter your account-identifier.
    • Warehouse: Enter your Warehouse.
    • Database: Enter your Database.
    • Region: Enter your region.
    • Role: Enter your Role.
  5. Click Test. After the connection test succeeds, click Save.
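As an alternative to the UI form, Airflow (2.3 and later) can read a connection from an environment variable named AIRFLOW_CONN_<CONN_ID> containing a JSON-serialized connection. The sketch below builds an equivalent Snowflake connection in that format; all field values are placeholders you would replace with the details copied from Snowsight:

```python
import json

# Placeholder values -- substitute the details you copied from Snowsight.
conn = {
    "conn_type": "snowflake",
    "login": "airflow_user",
    "password": "********",
    "schema": "my_schema",
    "extra": {
        "account": "my-account-identifier",
        "warehouse": "my_warehouse",
        "database": "my_database",
        "region": "us-east-1",
        "role": "my_role",
    },
}

# Exporting this as AIRFLOW_CONN_SNOWFLAKE_DEFAULT makes the connection
# available under the connection ID "snowflake_default".
print(f"AIRFLOW_CONN_SNOWFLAKE_DEFAULT='{json.dumps(conn)}'")
```

Connections defined this way do not appear in the Airflow UI's connection list, but operators and hooks can use them by connection ID as usual.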

    Screenshot of the Snowflake connection form in the Airflow UI.

How it works

Airflow uses the Snowflake Connector for Python package to connect to Snowflake through the SnowflakeHook.
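Conceptually, the hook translates the Airflow connection fields into the keyword arguments that the connector's connect() call expects. The function below is an illustrative sketch of that translation, not the provider's actual internal code; the field names follow the connection form above:

```python
def connection_to_connector_kwargs(conn: dict) -> dict:
    """Illustrative sketch: map Airflow connection fields to keyword
    arguments for snowflake.connector.connect().
    (Simplified -- the real SnowflakeHook supports many more options.)"""
    extra = conn.get("extra", {})
    return {
        "user": conn["login"],
        "password": conn["password"],
        "schema": conn.get("schema"),
        "account": extra["account"],
        "warehouse": extra.get("warehouse"),
        "database": extra.get("database"),
        "region": extra.get("region"),
        "role": extra.get("role"),
    }

# Hypothetical connection matching the fields filled out above:
kwargs = connection_to_connector_kwargs({
    "login": "airflow_user",
    "password": "********",
    "schema": "my_schema",
    "extra": {"account": "ZS86751", "warehouse": "my_wh",
              "database": "my_db", "region": "europe-west4.gcp",
              "role": "my_role"},
})
print(kwargs["account"], kwargs["region"])
```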
