Airflow logging
An introduction to Airflow logging.
Data quality
Check the quality of your data using Airflow.
Datasets and data-aware scheduling
Using datasets to implement DAG dependencies and scheduling in Airflow.
Deferrable operators
Implement deferrable operators to save cost and resources with Airflow.
Dynamically generate DAGs
In Airflow, DAGs are defined as Python code. Airflow executes all Python code in the dags_folder and loads any DAG objects that appear in globals(). The simplest way to create a DAG is to write it as a static Python file.
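The globals() mechanism described above is what both static and dynamic DAG definition rely on: any DAG object bound to a module-level name is discovered by the loader. A minimal sketch of that registration pattern, using a hypothetical FakeDag class as a stand-in for airflow.models.DAG so the example runs without an Airflow installation:

```python
# Sketch of the globals()-based registration Airflow's DAG loader relies on.
# FakeDag is a hypothetical stand-in for airflow.models.DAG; a real DAG file
# would instantiate DAG objects instead.

class FakeDag:
    def __init__(self, dag_id):
        self.dag_id = dag_id

# Static style: one module-level object, picked up from globals().
my_dag = FakeDag("my_static_dag")

# Dynamic style: generate several DAGs in a loop and register each one
# in globals() so the loader can discover it. Assigning to a loop-local
# variable alone would not work: only module-level names are scanned.
for source in ["orders", "customers", "payments"]:
    dag_id = f"load_{source}"
    globals()[dag_id] = FakeDag(dag_id)

registered = sorted(
    name for name, obj in list(globals().items()) if isinstance(obj, FakeDag)
)
print(registered)
# → ['load_customers', 'load_orders', 'load_payments', 'my_dag']
```

The loop illustrates why dynamic generation must write into globals() explicitly: reassigning a single variable on each iteration would leave only the last DAG visible to the loader.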
KubernetesPodOperator
Use the KubernetesPodOperator in Airflow to run tasks in Kubernetes Pods.
Plugins
How to use Airflow plugins.
Pools
Use pools to control Airflow task parallelism.
Setup/teardown tasks
Learn how to use setup and teardown tasks to manage task resources in Airflow.
Test DAGs
Effectively testing DAGs requires an understanding of their structure and their relationship to other code and data in your environment. In this guide, you'll learn about various types of DAG validation testing, unit testing, and where to find further information on data quality checks.
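One common validation-style check is verifying that every file in the DAG folder at least parses as valid Python before Airflow tries to load it. The sketch below uses only the standard library (the `find_syntax_errors` helper and temp-folder setup are illustrative, not Airflow APIs; a real validation test would also import the files and inspect the resulting DagBag):

```python
import ast
import tempfile
from pathlib import Path


def find_syntax_errors(dags_folder):
    """Return {filename: error message} for DAG files that fail to parse.

    A lightweight stand-in for DAG validation testing: it catches syntax
    errors without importing the files or requiring Airflow.
    """
    errors = {}
    for path in Path(dags_folder).glob("*.py"):
        try:
            ast.parse(path.read_text(), filename=str(path))
        except SyntaxError as exc:
            errors[path.name] = str(exc)
    return errors


# Example: one valid and one deliberately broken file in a temp folder.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "good_dag.py").write_text("dag_id = 'good'\n")
    (Path(d) / "bad_dag.py").write_text("def broken(:\n")
    errs = find_syntax_errors(d)

print(sorted(errs))
# → ['bad_dag.py']
```

In a pytest suite, a check like this typically becomes a single test asserting that the returned dictionary is empty for the real dags_folder.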