Say you have an Airflow DAG that doesn't make sense to backfill: once it has run, running it again in quick succession would be completely pointless.
For example, if you're loading data into your database from a source that is only updated hourly, backfilling, which fires runs in rapid succession, would just import the same data again and again.
This is especially annoying when you instantiate a new hourly task and it runs N times, once for each hour it missed, doing redundant work before it settles into the interval you specified.
The only workaround I can think of is something the FAQ in the docs specifically advises against:
"We recommend against using dynamic values as start_date, especially datetime.now(), as it can be quite confusing."
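Concretely, that discouraged approach would look something like this sketch (the DAG id and task here are hypothetical, just to illustrate the dynamic start_date):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # any operator works; this one is just for illustration

# Discouraged: datetime.now() as start_date re-anchors the schedule every time
# the scheduler parses the file, which is exactly what the FAQ warns about.
dag = DAG(
    dag_id='hourly_load',           # hypothetical DAG id
    start_date=datetime.now(),      # the dynamic value the docs advise against
    schedule_interval='@hourly',
)

load = BashOperator(
    task_id='load_from_source',     # hypothetical task
    bash_command='echo "load the hourly data"',
    dag=dag,
)
```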
Is there any way to disable backfilling for a DAG, or should I do the above?
Upgrade to Airflow version 1.8 and set catchup_by_default = False in airflow.cfg, or apply catchup=False to each of your DAGs.
https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#catchup_by_default
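For example, assuming Airflow 1.8+, the per-DAG version might look like the following sketch (the DAG id, start date, and task are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # placeholder task for illustration

dag = DAG(
    dag_id='hourly_load',             # hypothetical DAG id
    start_date=datetime(2016, 1, 1),  # a fixed start_date, as the docs recommend
    schedule_interval='@hourly',
    catchup=False,                    # skip the backfill runs between start_date and now
)

load = BashOperator(
    task_id='load_from_source',
    bash_command='echo "load the hourly data"',
    dag=dag,
)
```

With catchup=False, the scheduler only creates a run for the most recent interval when it first picks up the DAG, instead of one run per missed hour; setting catchup_by_default = False in airflow.cfg makes this the default for every DAG that doesn't set catchup explicitly.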