# Apache Airflow with Docker Compose example

*UPGRADING ABOVE 1.10.5 — BE AWARE:* You need to create a `default_pool` for task instances and assign slots to it (around 1000, for example). Previously this was not required because `default_pool` existed out of the box, but now it must be created manually. Go to the UI, Admin -> Pools (http://localhost:8080/admin/pool/), press *Create*, and create a pool named `default_pool` with, for example, 100 or 1000 slots. Alternatively, see the CLI sketch at the end of this section.

Source files for the articles with descriptions on Medium:

Apache Airflow with LocalExecutor:

Apache Airflow with CeleryExecutor:

Install Python dependencies into the docker-compose cluster without re-building images

![Main Apache Airflow UI](/docs/img/main.png?raw=true "Main Apache Airflow UI")

![Version](/docs/img/version.png?raw=true "Version Screen")

### From 29.11:

1. Apache Airflow image was updated to version 1.10.6
2. Added test_dag into airflow_files
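
As an alternative to the UI, the pool can also be created with the Airflow 1.10-style CLI from inside a running container. This is only a minimal sketch: the `webserver` service name and the slot count are assumptions, so adjust them to match your `docker-compose.yml` and workload.

```bash
# Create 'default_pool' with 1000 slots using the Airflow 1.10 CLI.
# 'webserver' is an assumed service name -- replace it with the service
# name defined in your docker-compose.yml.
docker-compose exec webserver \
    airflow pool -s default_pool 1000 "Default pool for task instances"

# Optionally verify that the pool now exists.
docker-compose exec webserver airflow pool -g default_pool
```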