The Astro CLI is an open source command line for modern data orchestration. It is the easiest way to run Apache Airflow on your machine. By the end of this quickstart, you'll have an Airflow environment running locally with just a few commands. From there, you can start building your project with your own DAGs, dependencies, and tests.
Step 1: Install the Astro CLI
To use the Astro CLI on Mac, you must have Homebrew installed.
To install the latest version of the Astro CLI, run the following command:
brew install astro
To use the Astro CLI on Windows, you must have:
- Docker Desktop for Windows.
- Docker Engine (v0.18.9 or higher).
- WSL enabled on your local machine.
- Windows 10 or Windows 11.
Go to the Releases page of the Astro CLI GitHub repository. Based on your desired CLI version and the CPU architecture of your machine, download one of the `.zip` files available on this page. For example, to install v1.0.0 of the Astro CLI on a Windows machine with an AMD64 architecture, download the corresponding `.zip` file. If the `.zip` file does not automatically unzip, run the following command to unzip the executable:
tar -xvzf .\astrocli.tar.gz
Add the filepath for the directory containing `astro.exe` as a PATH environment variable. For example, if `astro.exe` is stored in `C:\Users\username\astro.exe`, you would add `C:\Users\username` as your PATH environment variable. To learn more about configuring the PATH environment variable, see the Java documentation.
Restart your machine.
Step 2: Confirm the install
To confirm the CLI was installed properly, run `astro version`. If the installation was successful, you should see output similar to the following:
% astro version
Astro CLI Version: 1.2.0
Step 3: Create an Astro project
To start developing locally, you first need to create an Astro project, which contains all of the files you need to run Apache Airflow locally.
To create a new Astro project:
Create a new directory for your Astro project:
Open the directory:
Initialize an Astro project in the directory:
astro dev init
This Astro CLI command generates all of the necessary files to run Airflow locally in your new directory. This includes dedicated folders for your DAG files, plugins, and dependencies.
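As a rough sketch, a newly initialized project typically looks something like the following (the exact set of generated files can vary by CLI version):

```
.
├── dags/                  # Python files for your Airflow DAGs
├── include/               # supporting files referenced by your DAGs
├── plugins/               # custom or community Airflow plugins
├── tests/                 # unit tests for your DAGs
├── Dockerfile             # builds the image for your local Airflow environment
├── packages.txt           # OS-level packages to install
├── requirements.txt       # Python packages to install
└── airflow_settings.yaml  # local-only Airflow connections and variables
```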
Step 4: Run Airflow locally
To confirm that you successfully initialized an Astro project, run the following command from your project directory:
astro dev start
This command builds your project and spins up 4 Docker containers on your machine, each for a different Airflow component:
- Postgres: Airflow's metadata database
- Webserver: The Airflow component responsible for rendering the Airflow UI
- Scheduler: The Airflow component responsible for monitoring and triggering tasks
- Triggerer: The Airflow component responsible for running Triggers and signaling tasks to resume when their conditions have been met. The triggerer is used exclusively for tasks that are run with deferrable operators
Step 5: Access the Airflow UI
Once your project builds successfully, you can access the Airflow UI by going to http://localhost:8080/ and logging in with `admin` for both your username and password.
After logging in, you should see two example DAGs in the Airflow UI that correspond to two files in the `dags` directory of your Astro project. These example DAGs are maintained by Astronomer and showcase Airflow features and best practices.
That's all it takes to run Airflow locally using the Astro CLI! From here, you can add new DAGs to your Astro project and start developing.
Once you install the CLI and have an Astro project running locally, there are a few different paths you can take:
- To write and test your own DAGs locally and configure your local Airflow environment, see Develop Project.
- To view logs, set up unit tests for your DAGs, or troubleshoot your local Airflow environment, see Test and troubleshoot locally.
- To deploy DAGs to an environment managed by Astronomer, see Deploy code. If you're not a customer but might be interested in Astro, reach out to us.
- To learn more about the available commands in the CLI, see the CLI command reference or run `astro help`.