Global environment variables

In Astro, certain environment variables have preset values that are required for your Deployments to function correctly and should not be overridden by your organization. The following table describes each global environment variable set by Astronomer.

For information on setting your own environment variables, see Environment variables.
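As a sketch of the mechanism, user-level variables are declared as `KEY=VALUE` pairs, for example in an Astro project's `.env` file. The variable names and values below are illustrative examples, not Astro defaults:

```shell
# Write a hypothetical .env file with two user-defined variables.
# AIRFLOW__CORE__PARALLELISM and MY_API_TOKEN are illustrative only.
cat > .env <<'EOF'
AIRFLOW__CORE__PARALLELISM=64
MY_API_TOKEN=example-token
EOF

# Each line is one KEY=VALUE pair; count them.
grep -c '=' .env
```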

Danger

The Cloud UI does not currently prevent you from setting these environment variables, but overriding them can result in unexpected behavior, including access problems, missing task logs, and failed tasks.

If you need to set one of these variables for a particular use case, contact Astronomer support.

System environment variables

| Environment Variable | Description | Value |
| --- | --- | --- |
| `AIRFLOW__LOGGING__DAG_PROCESSOR_LOG_TARGET` | Routes scheduler logs to stdout | `stdout` |
| `AIRFLOW__LOGGING__REMOTE_LOGGING` | Enables remote logging | `True` |
| `AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER` | Location of remote logging storage | `baseLogFolder` |
| `AIRFLOW_CONN_ASTRO_S3_LOGGING` | Connection URI for writing task logs to Astro's managed S3 bucket | `<Connection-URI>` |
| `AIRFLOW__LOGGING__ENCRYPT_S3_LOGS` | Determines whether to use server-side encryption for S3 logs | `False` |
| `AIRFLOW__WEBSERVER__BASE_URL` | The base URL of the Airflow UI | `https://${fullIngressHostname}` |
| `AIRFLOW__CORE__SQL_ALCHEMY_CONN` | The SQLAlchemy connection string for the metadata database | `dbConnSecret` |
| `AIRFLOW__WEBSERVER__UPDATE_FAB_PERMS` | Determines whether to update FAB permissions on webserver startup | `True` |
| `AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX` | Determines whether to enable werkzeug ProxyFix middleware for a reverse proxy | `True` |
| `AIRFLOW_CONN_AIRFLOW_DB` | The connection ID for accessing the Airflow metadata database | `dbConnSecret` |
| `AIRFLOW__CORE__FERNET_KEY` | The secret key for saving connection passwords in the metadata database | `fernetKeySecret` |
| `AIRFLOW__CORE__EXECUTOR` | The executor class that Airflow uses. Astro exclusively supports the Celery executor | `executor` |
| `AIRFLOW_HOME` | The home directory for an Astro project | `/usr/local/airflow` |
| `AIRFLOW__KUBERNETES__NAMESPACE` | The Kubernetes namespace where Airflow workers are created | `namespace` |
| `AIRFLOW__CORE__HOSTNAME_CALLABLE` | Path to a callable that resolves to the hostname | `airflow.utils.net.get_host_ip_address` |
| `AIRFLOW__SCHEDULER__STATSD_ON` | Determines whether StatsD is on | `True` |
| `AIRFLOW__SCHEDULER__STATSD_HOST` | The hostname for StatsD | `statsd.Hostname` |
| `AIRFLOW__SCHEDULER__STATSD_PORT` | The port for StatsD | `<statsd-port>` |
| `AIRFLOW__METRICS__STATSD_ON` | Determines whether metrics are sent to StatsD | `True` |
| `AIRFLOW__METRICS__STATSD_HOST` | The hostname for sending metrics to StatsD | `statsd.Hostname` |
| `AIRFLOW__METRICS__STATSD_PORT` | The port for sending metrics to StatsD | `<statsd-metrics-port>` |
| `AIRFLOW__WEBSERVER__COOKIE_SECURE` | Sets a secure flag on server cookies | `True` |
| `AIRFLOW__WEBSERVER__INSTANCE_NAME` | Shows the name of your Deployment in the Home view of the Airflow UI | `<Deployment-Name>` |
| `AIRFLOW__CELERY__WORKER_CONCURRENCY` | Determines how many tasks each Celery worker can run at any given time and is the basis of worker autoscaling logic | `<Max-Tasks-Per-Worker>` |
| `AIRFLOW__WEBSERVER__NAVBAR_COLOR` | The color of the main navigation bar in the Airflow UI | `#4a4466` |
| `AIRFLOW__WEBSERVER__EXPOSE_CONFIG` | Exposes the Configuration tab of the Airflow UI while hiding sensitive values | `NON-SENSITIVE-ONLY` |
| `AWS_SECRET_ACCESS_KEY` | The secret key for accessing Astro's managed S3 bucket¹ | `<s3-aws-access-key-secret>` |
| `OPENLINEAGE_URL` | The URL for your Astro lineage backend and the destination for lineage data sent from external systems to the OpenLineage API | `https://astro-<your-astro-base-domain>.datakin.com` |
| `OPENLINEAGE_API_KEY` | Your OpenLineage API key | `<your-lineage-api-key>` |
Info

¹ The `AWS_SECRET_ACCESS_KEY` and `AWS_ACCESS_KEY_ID` environment variables are required only for Deployments running on AWS, and they should not be overridden for any Deployment running on an AWS cluster.

There are no restrictions on setting these values for Deployments running on GCP and Azure.
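Most of the `AIRFLOW__*` variables above follow Airflow's standard `AIRFLOW__{SECTION}__{KEY}` naming convention, where an environment variable overrides the matching `key` under `[section]` in `airflow.cfg`. A minimal local illustration of how a variable name maps back to a config section and key (run this locally only; per the warning above, do not override these variables on an Astro Deployment):

```shell
# AIRFLOW__LOGGING__REMOTE_LOGGING maps to the remote_logging option in the
# [logging] section of airflow.cfg. Exporting it here only illustrates the
# naming convention; do not set it on an Astro Deployment.
export AIRFLOW__LOGGING__REMOTE_LOGGING=True

# Derive the section and key from the variable name using POSIX
# parameter expansion.
name="AIRFLOW__LOGGING__REMOTE_LOGGING"
rest="${name#AIRFLOW__}"   # LOGGING__REMOTE_LOGGING
section="${rest%%__*}"     # LOGGING
key="${rest#*__}"          # REMOTE_LOGGING

section_lc=$(printf '%s' "$section" | tr 'A-Z' 'a-z')
key_lc=$(printf '%s' "$key" | tr 'A-Z' 'a-z')
echo "[$section_lc] $key_lc = $AIRFLOW__LOGGING__REMOTE_LOGGING"
# prints: [logging] remote_logging = True
```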