
As data gets bigger and faster, many organizations struggle to centralize and prepare unwieldy data sets for analytics.

Apache Airflow is a data workflow management system that allows engineers to schedule, deploy, and monitor their own data pipelines as DAGs (directed acyclic graphs). Built by developers, for developers, it's based on the principle that ETL is best expressed in code.
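
To make that concrete, here is a minimal sketch of an ETL pipeline written as an ordinary Python file, using Airflow's TaskFlow API and assuming Airflow 2.4 or later. The task bodies are stubs standing in for real source and destination systems.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract():
        # Stand-in for pulling raw records from a source system.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records):
        # Stand-in for cleaning or enriching the extracted records.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records):
        # Stand-in for writing results to a destination system.
        print(f"loading {len(records)} records")

    # Task dependencies (extract -> transform -> load) come from the call chain.
    load(transform(extract()))


example_etl()
```

Because the pipeline is just Python, it can be version-controlled, reviewed, and tested like any other code, and the scheduler takes care of running each task in dependency order.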

Airflow orchestrates tasks across databases, data lakes, APIs, and other systems.

Unlike proprietary tools that work well only within a single system or cloud service, Airflow lets you define (in code), schedule, and monitor workflows that span any combination of systems, so you can use your data however you want.
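
As a sketch of what cross-system orchestration can look like, the DAG below copies records from an HTTP API into a Postgres table using provider hooks. It assumes Airflow 2.4 or later with the apache-airflow-providers-http and apache-airflow-providers-postgres packages installed; the connection IDs (my_api, my_postgres), endpoint, table, and columns are placeholders, not part of any real setup.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.http.hooks.http import HttpHook
from airflow.providers.postgres.hooks.postgres import PostgresHook


def copy_api_to_postgres():
    # Fetch JSON records from an HTTP API via an Airflow connection (placeholder ID).
    api = HttpHook(method="GET", http_conn_id="my_api")
    records = api.run(endpoint="/v1/orders").json()

    # Load the rows into a Postgres table via another connection (placeholder ID).
    pg = PostgresHook(postgres_conn_id="my_postgres")
    pg.insert_rows(
        table="orders",
        rows=[(r["id"], r["amount"]) for r in records],
        target_fields=["id", "amount"],
    )


with DAG(
    dag_id="api_to_postgres",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="copy_api_to_postgres",
        python_callable=copy_api_to_postgres,
    )
```

Hooks keep credentials in Airflow connections rather than in the DAG file, so the same pipeline code can point at different source and destination systems per environment.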

By running Apache Airflow on Astronomer, you get:

  • One-command deployment through the Astro CLI
  • Access to our Airflow contributions, which include new hooks and operators
  • Serverless Airflow worker scalability through our SaaS platform
  • Access to our support documentation and team
  • Access to our Airflow development services team

Airflow 101

Getting Started with Apache Airflow on Astronomer

Other Resources