
Configure an external secrets backend on Astronomer Software

Apache Airflow variables and connections often contain sensitive information about your external systems that should be kept secret in a secure, centralized location that complies with your organization's security requirements. While secret values of Airflow variables and connections are encrypted in the Airflow metadata database of every Deployment, Astronomer recommends integrating with a secrets backend tool.

Secrets backend tool integration benefits

Integrating a secrets backend tool with Astronomer Software allows you to:

  • Store Airflow variables and connections in a centralized location alongside secrets from other tools and systems used by your organization, including Kubernetes secrets, SSL certificates, and more.
  • Comply with internal security postures and policies that protect your organization.
  • Recover Airflow variables and connections in the case of an incident.
  • Automatically pull Airflow variables and connections that are already stored in your secrets backend when you create a new Deployment instead of having to set them manually in the Airflow UI.

Astronomer Software integrates with the following secrets backend tools:

  • Hashicorp Vault
  • AWS Systems Manager Parameter Store
  • Google Cloud Secret Manager
  • Azure Key Vault

Secrets backend integrations are configured individually with each Astronomer Software Deployment.

info

If you enable a secrets backend on Astronomer Software, you can continue to define Airflow variables and connections either as environment variables or in the Airflow UI as needed. If you define variables and connections in the Airflow UI, they are stored as encrypted values in the Airflow metadata database.

Airflow checks for the value of an Airflow variable or connection in the following order:

  1. Secrets backend
  2. Environment variable
  3. The Airflow UI
tip

Setting Airflow connections with secrets requires knowledge of how to generate Airflow connection URIs. If you plan to store Airflow connections on your secrets backend, see the Apache Airflow documentation for guidance on how to generate a connection URI.
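For example, the following sketch builds a URI with Airflow's Connection model. The connection ID and field values are placeholders; swap in your own:

from airflow.models.connection import Connection

# Hypothetical Postgres connection; replace every field with your own values.
conn = Connection(
    conn_id="my_postgres",
    conn_type="postgres",
    login="my-login",
    password="my-password",
    host="my-host",
    port=5432,
)

# Prints postgres://my-login:my-password@my-host:5432
print(conn.get_uri())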

Setup

In this section, you'll learn how to use Hashicorp Vault as a secrets backend both for local development and for Deployments on Astronomer Software. To do this, you will:

  • Create an AppRole in Vault that grants Astronomer Software only the minimum required permissions.
  • Write a test Airflow variable or connection as a secret to your Vault server.
  • Configure your Astro project to pull the secret from Vault.
  • Test the backend in a local environment.
  • Deploy your changes to Astronomer Software.

Prerequisites

If you do not already have a Vault server deployed but would like to test this feature, Astronomer recommends that you either run a local Vault server in development mode with the Vault CLI or create a Vault cluster on HashiCorp Cloud Platform.
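If you just need a throwaway server for this walkthrough, Vault's dev mode is one option. This is a minimal sketch; dev mode runs in memory, is unsealed automatically, and is not suitable for production:

vault server -dev
export VAULT_ADDR='http://127.0.0.1:8200'

Run the export command in a second terminal so that subsequent vault CLI commands target the dev server.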

Step 1: Create a Policy and AppRole in Vault

To use Vault as a secrets backend, Astronomer recommends configuring a Vault AppRole with a policy that grants only the minimum necessary permissions for Astronomer Software. To do this:

  1. Create a Vault policy with the following permissions:

    path "secret/data/variables/*" {
    capabilities = ["read", "list"]
    }

    path "secret/data/connections/*" {
    capabilities = ["read", "list"]
    }
  2. Create a Vault AppRole and attach the policy you just created to it.
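
    For example, assuming you saved the policy above as astro-policy.hcl (the policy name, AppRole name, and token TTLs below are illustrative), the commands might look like this:

    vault policy write astro-policy astro-policy.hcl
    vault auth enable approle
    vault write auth/approle/role/astro-approle token_policies="astro-policy" token_ttl=1h token_max_ttl=4h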

  3. Retrieve the role-id and secret-id for your AppRole by running the following commands:

    vault read auth/approle/role/<your-approle>/role-id
    vault write -f auth/approle/role/<your-approle>/secret-id

    Save these values for Step 3.

Step 2: Write an Airflow variable or connection to Vault

To test whether your Vault server is set up properly, create a test Airflow variable or connection to store as a secret.

To store an Airflow variable in Vault as a secret, run the following Vault CLI command with your own values:

vault kv put secret/variables/<your-variable-key> value=<your-variable-value>

To store a connection in Vault as a secret, run the following Vault CLI command with your own values:

vault kv put secret/connections/<your-connection-id> conn_uri=<connection-type>://<connection-login>:<connection-password>@<connection-host>:5432

To confirm that your secret was written to Vault successfully, run:

# For variables
$ vault kv get secret/variables/<your-variable-key>
# For connections
$ vault kv get secret/connections/<your-connection-id>
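For example, with a hypothetical variable key hello, the full round trip might look like this (the value matches the log output shown in Step 4):

vault kv put secret/variables/hello value=my-test-variable
vault kv get secret/variables/hello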

Step 3: Set up Vault locally

In your Astro project, add the Hashicorp Airflow provider by adding the following line to your requirements.txt file:

apache-airflow-providers-hashicorp

Then, add the following environment variables to your Dockerfile:

# Make sure to replace `<your-approle-id>` and `<your-approle-secret>` with your own values.
ENV AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
ENV AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "variables_path": "variables", "config_path": null, "url": "http://host.docker.internal:8200", "auth_type": "approle", "role_id":"<your-approle-id>", "secret_id":"<your-approle-secret>"}

This tells Airflow to look for variable and connection information at the secret/variables/* and secret/connections/* paths in your Vault server. In the next step, you'll test this configuration in a local Airflow environment.

danger

If you want to deploy your project to a hosted Git repository before deploying to Astronomer Software, be sure to save <your-approle-id> and <your-approle-secret> securely. Astronomer recommends adding them to your project's .env file and specifying this file in .gitignore.

When you deploy to Astronomer Software in Step 4, you can set these values as secrets in the Software UI.
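As a minimal sketch, assuming you follow that recommendation, your .env file might look like this:

# .env (listed in .gitignore so your AppRole credentials never reach your Git repository)
AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "variables_path": "variables", "config_path": null, "url": "http://host.docker.internal:8200", "auth_type": "approle", "role_id": "<your-approle-id>", "secret_id": "<your-approle-secret>"}

If you take this route, you can remove the corresponding ENV lines from your Dockerfile, because the Astro CLI loads .env into your local Airflow environment automatically.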

info

By default, Airflow uses "kv_engine_version": 2, but this secret was written using v1. You can change this to accommodate how you write and read your secrets.
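For example, to tell the backend that your secrets live in a KV version 1 engine, you might add kv_engine_version to the kwargs from Step 3 (all other values stay the same):

ENV AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "variables_path": "variables", "config_path": null, "url": "http://host.docker.internal:8200", "auth_type": "approle", "role_id":"<your-approle-id>", "secret_id":"<your-approle-secret>", "kv_engine_version": 1}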

For more information on the Airflow provider for Hashicorp Vault and how to further customize your integration, see the Apache Airflow documentation.

Step 4: Run an example DAG to test Vault locally

To test Vault, write a simple DAG that retrieves your test secret and add it to your project's dags directory. For example, you can use the following DAG to print the value of a variable to your task logs:

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator
from datetime import datetime

def print_var():
    # Resolved from secret/data/variables/<your-variable-key> through the Vault backend
    my_var = Variable.get("<your-variable-key>")
    print(f'My variable is: {my_var}')

with DAG('example_secrets_dags', start_date=datetime(2022, 1, 1), schedule=None) as dag:

    test_task = PythonOperator(
        task_id='test-task',
        python_callable=print_var,
    )
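
If you stored a test connection in Step 2 instead of a variable, a similar task can retrieve it. A minimal sketch, using the connection ID you wrote under secret/connections/:

from airflow.hooks.base import BaseHook

def print_conn_host():
    # Resolved from secret/data/connections/<your-connection-id> through the Vault backend
    conn = BaseHook.get_connection("<your-connection-id>")
    print(f'My connection host is: {conn.host}')

# Add another PythonOperator inside the `with DAG(...)` block above that calls print_conn_host.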

Once you've added this DAG to your project:

  1. Run astro dev restart to push your changes to your local Airflow environment.

  2. In the Airflow UI (http://localhost:8080), trigger your new DAG.

  3. Click on test-task > View Logs. If you ran the example DAG above, you should see the contents of your secret in the task logs:

    {logging_mixin.py:109} INFO - My variable is: my-test-variable

Once you confirm that the setup was successful, you can delete this example DAG.

Step 5: Deploy on Astronomer Software

Once you've confirmed that the integration with Vault works locally, you can complete a similar setup for a Deployment on Astronomer Software.

  1. In the Software UI, add the same environment variables found in your Dockerfile to your Deployment, replacing the local url value with the address of a Vault server that your Deployment can reach (see the example after this list). Set AIRFLOW__SECRETS__BACKEND_KWARGS as a secret to ensure that your Vault credentials are stored securely.
  2. In your Astro project, delete the environment variables from your Dockerfile.
  3. Deploy your changes to Astronomer Software.
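The http://host.docker.internal:8200 address from Step 3 only resolves from containers running on your local machine. A hypothetical example of the Deployment values, with vault.example.com standing in for your Vault server's address:

AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "variables_path": "variables", "config_path": null, "url": "https://vault.example.com:8200", "auth_type": "approle", "role_id": "<your-approle-id>", "secret_id": "<your-approle-secret>"}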

Now, any Airflow variable or connection that you write to your Vault server can be successfully accessed and pulled by any DAG in your Deployment on Astronomer Software.
