
Configure an external secrets backend on Astro

Apache Airflow variables and connections often contain sensitive information about your external systems that should be kept secret in a secure, centralized location that complies with your organization's security requirements.

While secret values of Airflow variables and connections are encrypted in the Airflow metadata database of every Deployment, Astronomer recommends integrating with a secrets backend tool. This guide explains how to configure connections to various secrets backend tools on Astro.


If you only need a local connection to your cloud for testing purposes, consider mounting your user credentials to a local Airflow environment. While this implementation is not recommended for deployed environments, it lets you quickly test pipelines with data hosted in your cloud. See Authenticate to cloud services.


Integrating a secrets backend tool with Astro allows you to:

  • Store Airflow variables and connections in a centralized location alongside secrets from other tools and systems used by your organization, including Kubernetes secrets, SSL certificates, and more.
  • Comply with internal security postures and policies that protect your organization.
  • Recover your Airflow variables and connections in the case of an incident.
  • Automatically pull Airflow variables and connections that are already stored in your secrets backend when you create a new Deployment instead of having to set them manually in the Airflow UI.

Astro integrates with the following secrets backend tools:

  • Hashicorp Vault
  • AWS Systems Manager Parameter Store
  • AWS Secrets Manager
  • Google Cloud Secret Manager
  • Azure Key Vault

Secrets backend integrations are configured individually with each Astro Deployment.


If you enable a secrets backend on Astro, you can continue to define Airflow variables and connections either as environment variables or in the Airflow UI. If you set Airflow variables and connections in the Airflow UI, they are stored as encrypted values in the Airflow metadata database.
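For example, Airflow parses environment variables prefixed with `AIRFLOW_VAR_` and `AIRFLOW_CONN_` as Airflow variables and connections, respectively. A hypothetical `.env` snippet (names and values are placeholders):

```
# Hypothetical examples: Airflow reads these as the variable "my_var"
# and the connection "my_postgres"
AIRFLOW_VAR_MY_VAR=some-value
AIRFLOW_CONN_MY_POSTGRES=postgres://user:pass@host:5432/schema
```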

Airflow checks for the value of an Airflow variable or connection in the following order:

  1. Secrets backend
  2. Environment variable
  3. The Airflow UI
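The precedence above can be sketched as a simple lookup, with each source represented here as a dict (a toy illustration, not Airflow's actual implementation):

```python
def resolve(key, secrets_backend, env_vars, metadata_db):
    """Return the value for key from the first source that defines it,
    mirroring Airflow's lookup order: secrets backend, then environment
    variables, then the metadata database (values set in the Airflow UI)."""
    for source in (secrets_backend, env_vars, metadata_db):
        if key in source:
            return source[key]
    return None

# A value in the secrets backend shadows one set in the Airflow UI:
resolve("api_key", {"api_key": "from-backend"}, {}, {"api_key": "from-ui"})
# → "from-backend"
```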

Using secrets to set Airflow connections requires knowledge of how to generate Airflow connection URIs. If you plan to store Airflow connections on your secrets backend, see URI format for guidance on how to generate a connection URI.
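As an illustration of the URI format, a connection string can be assembled from its parts with percent-encoded credentials. This is a minimal standard-library sketch; the field names follow Airflow's connection model, and the values are placeholders:

```python
from urllib.parse import quote

def build_conn_uri(conn_type, login, password, host, port, schema):
    # Percent-encode login and password so characters like "@" and "/"
    # don't break the URI structure.
    return (
        f"{conn_type}://{quote(login, safe='')}:{quote(password, safe='')}"
        f"@{host}:{port}/{schema}"
    )

build_conn_uri("postgres", "my_user", "p@ss/word", "db.example.com", 5432, "analytics")
# → "postgres://my_user:p%40ss%2Fword@db.example.com:5432/analytics"
```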


This topic provides setup steps for configuring AWS Secrets Manager as a secrets backend on Astro.

For more information about Airflow and AWS connections, see Amazon Web Services Connection.


Add Airflow secrets to Secrets Manager

In AWS Secrets Manager, create secrets for the Airflow variables and connections that you want to store. You can use real or test values.

  • When setting the secret type, choose Other type of secret and select the Plaintext option.
  • If creating a connection URI or a non-dict variable as a secret, remove the brackets and quotation marks that are pre-populated in the plaintext field.
  • The secret name is assigned after providing the plaintext value and clicking Next.

Secret names must correspond with the connections_prefix and variables_prefix set in your secrets backend configuration. Specifically:

  • If you use "variables_prefix": "airflow/variables", you must set Airflow variable names as airflow/variables/<variable-key>.

  • The <variable-key> is how you will retrieve that variable's value in a DAG. For example:

    my_var = Variable.get("<variable-key>")
  • If you use "connections_prefix": "airflow/connections", you must set Airflow connection names as airflow/connections/<connection-id>.

  • The <connection-id> is how you will retrieve that connection's URI in a DAG. For example:

    conn = BaseHook.get_connection(conn_id="<connection-id>")
  • Be sure not to include a leading / at the beginning of your variable or connection names.

For more information on adding secrets to Secrets Manager, see AWS documentation.
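The naming rules above amount to joining the configured prefix and the key with a slash, never starting the name with one. A small sketch (the helper and its inputs are illustrative, not part of any library):

```python
def secret_name(prefix, key):
    # Secrets Manager secret names are "<prefix>/<key>" with no leading slash.
    return f"{prefix}/{key}".lstrip("/")

secret_name("airflow/variables", "my_var")        # → "airflow/variables/my_var"
secret_name("airflow/connections", "my_postgres") # → "airflow/connections/my_postgres"
```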

Set up Secrets Manager locally

Add the following environment variables to your Astro project's .env file:
AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables"}
AWS_ACCESS_KEY_ID=<your-access-key-id> # Make sure this user has permission to access Secrets Manager
AWS_SECRET_ACCESS_KEY=<your-secret-access-key>

After you configure an Airflow connection to AWS, you can run a DAG locally to check that your variables are accessible using Variable.get("<your-variable-key>").

Deploy environment variables to Astro

  1. Run the following commands to export your secrets backend configurations as environment variables to Astro.

    $ astro deployment variable create --deployment-id <your-deployment-id> AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend --secret

    $ astro deployment variable create --deployment-id <your-deployment-id> AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "role_arn": "<your-role-arn>", "region_name": "<your-region>"}' --secret
  2. Optional. Remove the environment variables from your .env file or store your .env file in a safe location to protect your credentials.


    If you delete the .env file, the Secrets Manager backend won't work locally.

  3. Open the Airflow UI for your Deployment and create an Amazon Web Services connection without credentials. When you use this connection in a DAG, Airflow will automatically fall back to using the credentials in your configured environment variables.

To further customize the Airflow and AWS Secrets Manager integration, see the full list of available kwargs.
