Airflow is randomly not running queued tasks, and some tasks don't even get a queued status. I keep seeing the following in the scheduler logs:
[2018-02-28 02:24:58,780] {jobs.py:1077} INFO - No tasks to consider for execution.
I do see tasks in the database that either have no status or a queued status, but they never get started.
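For what it's worth, the state distribution is easy to check straight from the metadata database. A rough sketch against the 1.x models (not the exact query I ran) that counts task instances per state:

```python
from airflow import settings
from airflow.models import TaskInstance
from sqlalchemy import func

session = settings.Session()

# Count task instances per state; stuck tasks show up under 'queued' or NULL (no status).
state_counts = (
    session.query(TaskInstance.state, func.count(TaskInstance.task_id))
    .group_by(TaskInstance.state)
    .all()
)

for state, count in state_counts:
    print('%s: %d' % (state or 'no status', count))

session.close()
```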
The Airflow setup is running https://github.com/puckel/docker-airflow on ECS with Redis. There are 4 scheduler threads and 4 Celery worker tasks. The tasks that are not running show a queued state (grey icon); when hovering over the task icon, the operator is null, and the task details say:
All dependencies are met but the task instance is not running. In most cases this just means that the task will probably be scheduled soon unless:
- The scheduler is down or under heavy load
Metrics on the scheduler do not show heavy load. The DAG is very simple, with 2 independent tasks that only depend on their last run. There are also tasks in the same DAG that are stuck with no status (white icon).
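For reference, the DAG is shaped roughly like the sketch below. The DAG id, task ids and the DummyOperator are placeholders; only the structure (two independent tasks, each depending only on its own previous run) matches the real DAG:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 2, 1),   # placeholder start date
    'depends_on_past': True,              # each task depends only on its own previous run
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'simple_two_task_dag',                # placeholder DAG id
    default_args=default_args,
    schedule_interval=timedelta(hours=1),
)

# Two independent tasks; no dependency is set between them.
task_a = DummyOperator(task_id='task_a', dag=dag)
task_b = DummyOperator(task_id='task_b', dag=dag)
```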
Interestingly, when I restart the scheduler, the tasks change to the running state.
Airflow can be a bit tricky to set up.
Is airflow scheduler running? Is airflow webserver running?

I've had, for instance, a DAG that was wrongly set to depends_on_past: True, which prevented the current instance from starting correctly.
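If the web UI message is not specific enough, a task instance can also be asked directly which dependency checks are failing. A rough sketch against the 1.8/1.9-era API (DAG id, task id and execution date are placeholders):

```python
from datetime import datetime

from airflow import settings
from airflow.models import DagBag, TaskInstance

session = settings.Session()

# Placeholders -- use the real DAG id, task id and execution date of a stuck run.
dag = DagBag().get_dag('simple_two_task_dag')
task = dag.get_task('task_a')
ti = TaskInstance(task, execution_date=datetime(2018, 2, 28))

# Lists every scheduling dependency that is currently not satisfied
# (e.g. previous-run state from depends_on_past, retry period, trigger rule).
for dep in ti.get_failed_dep_statuses(session=session):
    print('%s: %s' % (dep.dep_name, dep.reason))

session.close()
```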
There is also a great resource directly in the docs with a few more hints: Why isn't my task getting scheduled?
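One more check that complements the list above: the scheduler writes heartbeats to the jobs table in the metadata database, so you can confirm it is alive even without shelling into the container. A sketch against the 1.x models (the 5-minute threshold is an arbitrary choice):

```python
from datetime import datetime, timedelta

from airflow import settings
from airflow.jobs import BaseJob

session = settings.Session()

# Most recent heartbeat recorded by any scheduler process.
latest = (
    session.query(BaseJob.latest_heartbeat)
    .filter(BaseJob.job_type == 'SchedulerJob')
    .order_by(BaseJob.latest_heartbeat.desc())
    .first()
)

if latest is None:
    print('No scheduler job has ever been recorded.')
else:
    age = datetime.utcnow() - latest[0]
    print('Last scheduler heartbeat was %s ago.' % age)
    if age > timedelta(minutes=5):
        print('The scheduler looks dead or stalled.')

session.close()
```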