View logs
View task and component logs for your DAGs to troubleshoot your data pipelines and better understand the behavior of your tasks and their execution environment.
Log and message types
Scheduler logs describe the performance of the scheduler, which is responsible for scheduling and queueing task runs. For more information on configuring the scheduler on Astro, see Scheduler resources.
Triggerer logs describe the performance of the triggerer, the Airflow component responsible for running triggers and signaling tasks to resume when their conditions have been met. The triggerer is used exclusively for tasks that are run with deferrable operators.
Worker logs are generated by your workers as they execute tasks. These are the same task logs that you can access directly from the Airflow UI as described in View Airflow task logs.
Webserver logs relate to the health and performance of the Airflow UI.
Log levels
Logs and messages might also be associated with one of the following log levels:
- Error: Emitted when a process fails or does not complete. For example, these logs might indicate a missing DAG file, an issue with your scheduler's connection to the Airflow database, or an irregularity with your scheduler's heartbeat.
- Warn: Emitted when Airflow detects an issue that may or may not be of concern but does not require immediate action. This often includes deprecation notices marked as `DeprecationWarning`. For example, Airflow might recommend that you upgrade your Deployment if there was a change to the Airflow database or task execution logic.
- Info: Emitted frequently by Airflow to show that a standard scheduler process, such as DAG parsing, has started. These logs are frequent but can contain useful information. If you run dynamically generated DAGs, for example, these logs show how many DAGs were created per DAG file and how long it took the scheduler to parse each of them.
View logs in the Cloud UI
You can access scheduler, triggerer, and task logs in the Cloud UI, which shows the past 24 hours of logs for any Deployment on its Logs page. To view them:
- In the Cloud UI, select a Workspace and then a Deployment.
- Click the Logs tab.
The maximum number of lines returned is 10,000, with 25 results displayed per page. If there are no logs available for a given Deployment, the following message appears:
No matching events have been recorded in the past 24 hours.
Typically, this indicates that the Deployment you selected does not currently have any DAGs running.
Filter options
You can use the following options to specify the types of logs or messages that you want to view.
- String search: Enter a string, keyword, or phrase to find in your logs. You can also search with suffix wildcards by adding a `*` to your search query. For example, `acti*` returns results that include `action` and `acting`. The string search does not include fuzzy matching, so misspelled strings or incomplete strings without a wildcard (`*`) return zero results.
- Time range: Filter the displayed logs based on time.
- Log type: Filter based on whether the log message is from a scheduler, worker, webserver, or triggerer.
View Airflow task logs on Astro
Airflow task logs for both local Airflow environments and Deployments on Astro are available in the Airflow UI. Task logs can help you troubleshoot a specific task instance that failed or retried.
On Astro, Airflow task logs are stored in your cluster's object storage: on Amazon Web Services (AWS) in S3, on Google Cloud Platform (GCP) in Cloud Storage, and on Azure in Azure Blob Storage.
On clusters hosted in your own cloud, task logs are stored indefinitely. On clusters hosted in Astronomer's cloud, task logs are retained for 90 days. The task log retention policy is not currently configurable.
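For context, open-source Airflow enables this kind of remote task logging through its logging configuration. The following is an illustration only, assuming a hypothetical S3 bucket and connection; Astro manages this configuration for you, so you don't set these values yourself:

```sh
# Illustration only: how open-source Airflow points task logs at object storage.
# Astro configures this automatically; the bucket and connection ID are hypothetical.
export AIRFLOW__LOGGING__REMOTE_LOGGING=True
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://my-log-bucket/airflow-logs
export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=aws_default
```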
- Access the Airflow UI. To access the Airflow UI for a Deployment, open the Deployment in the Cloud UI and click Open Airflow. To access the Airflow UI in a local environment, open a browser and go to `http://localhost:8080` (see the command after these steps if you need to start a local environment).
- Click a DAG.
- Click Graph.
- Click a task run.
- Click Instance Details.
- Click Log.
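If you don't already have a local Airflow environment running, you can start one from an Astro project directory with the Astro CLI:

```sh
# Build and start a local Airflow environment;
# the Airflow UI becomes available at http://localhost:8080
astro dev start
```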
Access Airflow component logs locally
To show logs for your Airflow scheduler, webserver, or triggerer locally, run the following Astro CLI command:
```sh
astro dev logs
```
Once you run this command, the most recent logs for these components appear in your terminal window.
By default, running `astro dev logs` shows logs for all Airflow components. To see logs only for a specific component, add any of the following flags to your command (an example follows the list):
- `--scheduler`
- `--webserver`
- `--triggerer`
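For example, to view only the most recent scheduler logs from your local environment:

```sh
# Show recent logs from the local scheduler container only
astro dev logs --scheduler
```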
To continue monitoring logs, run `astro dev logs --follow`. The `--follow` flag ensures that the latest logs continue to appear in your terminal window. For more information about this command, see the CLI Command Reference.
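Because the command writes to standard output, you can also combine `--follow` with standard shell tools. As a sketch, assuming the default Airflow log format, which includes the level name in each line:

```sh
# Stream logs from all components and surface only warning- and error-level lines
astro dev logs --follow | grep -E "WARNING|ERROR"
```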
Logs for the Airflow webserver, worker, and triggerer are not available for Deployments on Astro.
Export task logs to Datadog (AWS only)
Astro supports forwarding Airflow task logs to Datadog. You only need to enable Datadog once for each Astro cluster. After you enable Datadog, task logs from all Deployments in the cluster are exported.
- Create a new Datadog API key or copy an existing API key. See API and Application Keys.
- Identify the Astro cluster from which you want to forward task logs.
- Submit a request to Astronomer support with your Datadog API key, the name of your Astro cluster, and the Datadog Site where you want the logs forwarded.
Astro also supports exporting Airflow metrics to Datadog. See Export Airflow metrics to Datadog.