
Apache Airflow glossary

Whether you're new to Airflow or not, use this glossary to quickly reference key Airflow terms, components, and concepts.

Airflow connection: An Airflow connection is a set of configurations and credentials that allows Airflow to connect with external tools.
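
A minimal sketch of retrieving a connection inside a task; the connection ID my_db_conn is a hypothetical example:

```python
from airflow.decorators import task
from airflow.hooks.base import BaseHook

@task
def check_connection():
    # "my_db_conn" is a hypothetical connection ID defined in the Airflow UI,
    # via the CLI, or as an environment variable.
    conn = BaseHook.get_connection("my_db_conn")
    print(conn.host, conn.login)
```
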
Airflow UI: The Airflow UI is the primary visual interface for managing DAG and task runs. It contains pages for modifying, monitoring, and troubleshooting an Airflow environment.
Airflow variable: An Airflow variable is a generic key-value pair, such as an API key or file path, that's stored in the Airflow metadata database and that you can reference in a DAG.
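
A minimal sketch of reading a variable inside a task; the key api_key is a hypothetical example:

```python
from airflow.decorators import task
from airflow.models import Variable

@task
def use_variable():
    # "api_key" is a hypothetical variable key stored in the metadata database.
    api_key = Variable.get("api_key")
    print(f"retrieved a value of length {len(api_key)}")
```

Variables can also be referenced in templated fields, for example with {{ var.value.api_key }}.
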
Apache Airflow: Apache Airflow is an open source tool for programmatically authoring, scheduling, and monitoring data pipelines written in Python. Airflow is scalable, configurable, and the industry standard for managing workflows across your data ecosystem.
Data orchestration: Data orchestration is the automated configuration, scheduling, and management of sequential, interdependent data tasks. Orchestrators should not be confused with simple schedulers: beyond scheduling, orchestration also handles errors, dependencies between tasks, and other aspects of data pipeline management.
Dataset: A dataset is a logical grouping of data consumed or produced by tasks in an Airflow DAG. It can be a table, a file, a blob, or a dataframe. Datasets can be used to schedule DAGs with dataset-driven scheduling.
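
A minimal sketch of dataset-driven scheduling (Airflow 2.4+); the dataset URI and DAG IDs are hypothetical:

```python
from datetime import datetime
from airflow.datasets import Dataset
from airflow.decorators import dag, task

# Any string URI works; this S3 path is a hypothetical example.
my_dataset = Dataset("s3://my-bucket/my-data.csv")

@dag(dag_id="producer", start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
def producer():
    @task(outlets=[my_dataset])
    def update_dataset():
        ...  # any task that writes the data

    update_dataset()

# Runs whenever the producer task updates the dataset.
@dag(dag_id="consumer", start_date=datetime(2024, 1, 1), schedule=[my_dataset], catchup=False)
def consumer():
    @task
    def read_dataset():
        ...

    read_dataset()

producer()
consumer()
```
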
Decorator: In Python, decorators are functions that take another function as an argument and extend the behavior of that function. In Airflow, decorators provide a simpler way to define Airflow tasks and DAGs compared to traditional operators.
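
A minimal sketch of the @task decorator, which replaces an explicit PythonOperator instantiation:

```python
from airflow.decorators import task

@task
def say_hello():
    # The decorator turns this plain Python function into an Airflow task.
    print("hello")
```
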
Deferrable operator: A deferrable operator, also known as an async operator, is an operator that suspends itself while waiting for its condition to be met and resumes on receiving the job status. Tasks that use deferrable operators consume resources more efficiently than sensors because they do not occupy a worker slot when they are in a deferred state. Instead, deferred tasks use the triggerer to poll for job statuses.
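
A minimal sketch using DateTimeSensorAsync, the deferrable counterpart of DateTimeSensor in core Airflow; the DAG ID and target time are arbitrary examples:

```python
from datetime import datetime
from airflow import DAG
from airflow.sensors.date_time import DateTimeSensorAsync

with DAG("deferrable_example", start_date=datetime(2024, 1, 1), schedule=None):
    # While this task waits, it releases its worker slot and the triggerer
    # polls until the target time is reached.
    wait = DateTimeSensorAsync(
        task_id="wait_until",
        target_time=datetime(2024, 1, 2),
    )
```
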
Docker image: An Airflow Docker image is a template used to build the Docker containers that run Airflow components and execute DAG code. Both Apache Airflow and Astronomer distribute Docker images for Airflow with different build instructions and pre-installed packages.
Dynamic DAG: A dynamic DAG is a DAG that is generated automatically when the scheduler parses the dags folder. You can dynamically create DAGs based on code in one or more Python files or by using tools like gusty or dag-factory.
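
A minimal sketch of dynamic DAG generation in a single Python file, assuming Airflow 2.4+ auto-registers DAGs created with the @dag decorator; the team names are hypothetical:

```python
from datetime import datetime
from airflow.decorators import dag, task

for team in ["marketing", "finance"]:

    @dag(dag_id=f"report_{team}", start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
    def report_dag():
        # Binding the loop variable as a default argument avoids Python's
        # late-binding closure behavior.
        @task
        def build_report(team_name: str = team):
            print(f"building report for {team_name}")

        build_report()

    report_dag()
```
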
Dynamic task: A dynamic task is a task instance that's generated at runtime based on a set of parameters in DAG code. Dynamic task mapping, the Airflow feature that creates dynamic tasks, allows users to create an arbitrary number of parallel tasks at runtime based on an input parameter.
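
A minimal sketch of dynamic task mapping; .expand() creates one mapped task instance per input value at runtime:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def mapping_example():
    @task
    def double(x: int) -> int:
        return 2 * x

    # Creates three parallel task instances, one per list element.
    double.expand(x=[1, 2, 3])

mapping_example()
```
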
Environment variable: An environment variable is a key-value pair that can be used to configure an Airflow environment. Airflow settings defined in the airflow.cfg file can be overridden by environment variables that follow the format AIRFLOW__{SECTION}__{KEY}; for example, AIRFLOW__CORE__PARALLELISM overrides the parallelism setting in the [core] section of airflow.cfg.
Executor: An executor is a core process within the Airflow scheduler that assigns scheduled tasks to the worker processes that run them. Airflow supports multiple executors, which differ based on the types of workers they use.
Hook: A hook is an abstraction of a specific API that allows Airflow to interact with an external system. Hooks are built into many operators, but they can also be used directly in DAG code.
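
A minimal sketch of using a hook directly in a task, assuming the apache-airflow-providers-postgres package is installed; the connection ID my_postgres_conn is hypothetical:

```python
from airflow.decorators import task

@task
def fetch_rows():
    from airflow.providers.postgres.hooks.postgres import PostgresHook

    # The hook reads credentials from the Airflow connection and wraps the
    # database API.
    hook = PostgresHook(postgres_conn_id="my_postgres_conn")
    return hook.get_records("SELECT 1;")
```
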
Jinja template: Jinja templating is a format used to pass dynamic information into task instances at runtime. A Jinja-templated value is enclosed in double curly braces.
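
A minimal sketch of a templated field; {{ ds }} is a built-in template variable that renders as the logical date of the DAG run:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG("jinja_example", start_date=datetime(2024, 1, 1), schedule=None):
    # The double curly braces are rendered at runtime, not at parse time.
    print_date = BashOperator(task_id="print_date", bash_command="echo {{ ds }}")
```
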
Notifier: A notifier is a reusable class that sends notifications to tools like Slack or PagerDuty, typically from task or DAG callbacks. Pre-built notifiers are included in some provider packages.
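
A hedged sketch, assuming the apache-airflow-providers-slack package is installed and exposes send_slack_notification; the connection ID is hypothetical:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.providers.slack.notifications.slack import send_slack_notification

with DAG(
    "notifier_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    # "slack_default" is a hypothetical Airflow connection ID.
    on_failure_callback=send_slack_notification(
        slack_conn_id="slack_default",
        text="The DAG {{ dag.dag_id }} failed.",
    ),
):
    EmptyOperator(task_id="placeholder")
```
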
Operator: Operators are the building blocks of Airflow DAGs. An operator contains the logic of how data is processed in a pipeline. Each task in a DAG is defined by instantiating an operator.
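
A minimal sketch of instantiating an operator to define a task; the DAG and task IDs are arbitrary:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG("operator_example", start_date=datetime(2024, 1, 1), schedule=None):
    # Each operator instantiation becomes one task in the DAG.
    hello = BashOperator(task_id="say_hello", bash_command="echo hello")
```
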
Provider: An Airflow provider is a Python package that can be added to core Airflow to extend its functionality. A provider package typically contains modules such as operators, hooks, and sensors to interact with an external service. You can add providers to your Airflow environment by adding their package names to the requirements.txt file of your Astro project. For a list of available providers, see the Astronomer Registry.
Scheduler: The scheduler is the Airflow component responsible for scheduling DAG runs and task instances. It is a multi-threaded Python process that determines what tasks need to be run, when they need to be run, and where they are run.
Sensor: An Airflow sensor is a special kind of operator that is designed to wait for something to happen. When a sensor runs, it checks whether a certain condition is met; only then is it marked successful, letting its downstream tasks execute.
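
A minimal sketch using FileSensor from core Airflow; the file path is a hypothetical example:

```python
from datetime import datetime
from airflow import DAG
from airflow.sensors.filesystem import FileSensor

with DAG("sensor_example", start_date=datetime(2024, 1, 1), schedule=None):
    # Rechecks every 60 seconds and succeeds once the file exists.
    wait_for_file = FileSensor(
        task_id="wait_for_file",
        filepath="/tmp/data_ready.csv",
        poke_interval=60,
    )
```
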
Task: A task is the basic unit of execution in Airflow. Tasks are arranged into DAGs, and then have upstream and downstream dependencies set between them to express the order in which they should run.
Task dependency: A task dependency is an instruction that defines whether a task must be completed before or after another task in the same DAG. Task dependencies are defined in DAG code, either explicitly with bitshift operators or implicitly with the TaskFlow API.
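
A minimal sketch of an explicit dependency set with the bitshift operator:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG("dependency_example", start_date=datetime(2024, 1, 1), schedule=None):
    extract = EmptyOperator(task_id="extract")
    load = EmptyOperator(task_id="load")

    # load runs only after extract completes.
    extract >> load
```
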
Task group: A task group is a way to visually organize a group of tasks in the Airflow UI. Task groups are defined in DAG code and render as groupings in the Graph view of the Airflow UI.
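
A minimal sketch of a task group; the group and task IDs are arbitrary:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.utils.task_group import TaskGroup

with DAG("task_group_example", start_date=datetime(2024, 1, 1), schedule=None):
    with TaskGroup(group_id="transform"):
        # These tasks render as one collapsible group in the Graph view.
        step_one = EmptyOperator(task_id="step_one")
        step_two = EmptyOperator(task_id="step_two")
        step_one >> step_two
```
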
TaskFlow API: The TaskFlow API is a framework for defining DAGs and tasks with decorators. Compared to using traditional operators, it simplifies passing data between tasks and defining dependencies.
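
A minimal sketch of passing data between TaskFlow tasks; returning a value and passing it to the next task sets the dependency and moves the data through XCom:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def taskflow_example():
    @task
    def extract() -> int:
        return 42

    @task
    def load(value: int):
        print(value)

    # The function call chains the tasks and handles the XCom exchange.
    load(extract())

taskflow_example()
```
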
Triggerer: The triggerer is an optional Airflow component responsible for running deferrable operators when they're in a deferred state.
Webserver: The webserver is the Airflow component that serves the Airflow UI. It is a Flask server running with Gunicorn.
XCom: XCom (short for "cross-communication") is an Airflow feature that allows you to exchange task metadata or small amounts of data between tasks. XComs are defined by a key, value, and timestamp.
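
A minimal sketch of explicit XCom pushes and pulls through the task instance (ti) object; the key and value are hypothetical:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def xcom_example():
    @task
    def push(ti=None):
        # Push a small value under a custom key.
        ti.xcom_push(key="row_count", value=100)

    @task
    def pull(ti=None):
        # Retrieve the value pushed by the upstream task.
        print(ti.xcom_pull(task_ids="push", key="row_count"))

    push() >> pull()

xcom_example()
```
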
