Effectively testing DAGs requires an understanding of their structure and of their relationship to other code and data in your environment. In this guide, you'll learn about DAG validation testing, unit testing, and where to find more information on data quality checks.
Related guides:

- An introduction to Airflow logging.
- Use the KubernetesPodOperator in Airflow to run tasks in Kubernetes Pods.
- Check the quality of your data using Airflow.
- SQL check operators: Executing queries in Apache Airflow DAGs to ensure data quality.
- Dynamically generate DAGs: In Airflow, DAGs are defined as Python code. Airflow executes all Python code in the `dags_folder` and loads any DAG objects that appear in `globals()`. The simplest way to create a DAG is to write it as a static Python file.
- Datasets and data-aware scheduling: Using datasets to implement DAG dependencies and scheduling in Airflow.
- How to use Airflow plugins.
- Implement deferrable operators to save cost and resources with Airflow.
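The note on dynamically generating DAGs above hinges on one mechanism: Airflow loads any `DAG` object bound to a module-level name, so registering generated DAGs in `globals()` makes them discoverable. The sketch below illustrates that pattern; it uses a stand-in `DAG` class (illustration only, not `airflow.DAG`) so it runs without Airflow installed, and the team names are made up.

```python
# Sketch of Airflow's DAG discovery pattern: any DAG object reachable as a
# module-level (global) name in a file in the dags_folder is loaded.
# `DAG` here is a stand-in class so the example is self-contained.

class DAG:  # stand-in for airflow.DAG, illustration only
    def __init__(self, dag_id):
        self.dag_id = dag_id

# Static definition: one global name, one DAG.
my_dag = DAG(dag_id="my_static_dag")

# Dynamic generation: build several DAGs in a loop and register each one
# in globals() so the parser can find them by name.
for team in ("ads", "billing", "growth"):
    dag_id = f"process_{team}"
    globals()[dag_id] = DAG(dag_id=dag_id)

# List every DAG object defined at module level.
print(sorted(name for name, obj in globals().items() if isinstance(obj, DAG)))
```

With real Airflow, replacing the stand-in with `from airflow import DAG` (plus a `start_date` and schedule) makes the scheduler pick up all four DAGs, because each one is reachable through a global name after the module executes.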