Airflow server not running jobs

Time: 2017-06-15 09:22:53

Tags: airflow airflow-scheduler

My Airflow server setup is not running any tasks, not even the example DAGs. Whenever I do a manual run, a DagRun object is created with status "running", but it stays in that state forever. This problem affects all the DAGs, not just one in particular.

Whenever I trigger a DAG I can see it appear in the scheduler log, but nothing appears in the Celery log.

I am able to run the tasks inside a DAG using the airflow test command; it's airflow trigger or a manual trigger from the UI that doesn't work.

I've ensured that all three of these processes are running; I've also put them under supervisor now.

  1. airflow webserver
  2. airflow scheduler
  3. airflow worker
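
Keeping the three processes above under supervisor might look like the following sketch. The binary path, the airflow service account, and the log locations are my assumptions, not taken from the question:

```ini
[program:airflow-webserver]
command=/usr/local/bin/airflow webserver
user=airflow                ; assumed service account
autostart=true
autorestart=true            ; restart the process if it dies
stderr_logfile=/var/log/airflow/webserver.err.log

[program:airflow-scheduler]
command=/usr/local/bin/airflow scheduler
user=airflow
autostart=true
autorestart=true
stderr_logfile=/var/log/airflow/scheduler.err.log

[program:airflow-worker]
command=/usr/local/bin/airflow worker
user=airflow
autostart=true
autorestart=true
stderr_logfile=/var/log/airflow/worker.err.log
```

With `autorestart=true`, supervisor will bring a crashed scheduler or worker back up, which helps rule out silent process death as the cause.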

Things that I've tried:

  • I've tried switching the executor from CeleryExecutor to LocalExecutor, but that didn't help.
  • I am currently using Redis for queuing, with the setting broker_url = redis://myhostname.com:6379/10 and the result backend setting celery_result_backend = amqp://guest:guest@localhost:5672. I've tried various combinations of RabbitMQ and Redis for these two settings, but that didn't help.
  • When using RabbitMQ I've tried both the amqp:// and pyamqp:// formats for specifying the broker URL.
  • I've tried changing the Celery version, but that resulted in errors. The version I'm using is celery==4.0.2.
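
Given the mixed broker/result-backend combinations described above, one quick sanity check is to parse both URLs and flag inconsistencies before restarting anything. This is only an illustrative helper, not part of Airflow or Celery; the function name and the list of accepted schemes are my assumptions:

```python
from urllib.parse import urlparse

def check_celery_urls(broker_url, result_backend):
    """Flag common misconfigurations between the Celery broker URL and
    the result backend, e.g. a Redis broker paired with an AMQP backend."""
    problems = []
    broker = urlparse(broker_url)
    backend = urlparse(result_backend)
    # Schemes Celery commonly accepts for each setting (assumed list).
    if broker.scheme not in ("redis", "amqp", "pyamqp"):
        problems.append("unexpected broker scheme: %s" % broker.scheme)
    if backend.scheme not in ("redis", "amqp", "rpc"):
        problems.append("unexpected result backend scheme: %s" % backend.scheme)
    # Mixing a Redis broker with an AMQP result backend works in principle,
    # but pointing both at the same service rules out one source of mismatch.
    if broker.scheme == "redis" and backend.scheme in ("amqp", "pyamqp"):
        problems.append("redis broker with amqp result backend: "
                        "consider pointing both at the same service")
    return problems
```

For example, the configuration quoted in the question would be flagged, while a pure-Redis pair would pass cleanly.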

This setup is running on Ubuntu 14.04.5 LTS; I've been able to run a local version of Airflow successfully on my Mac.

I've been stuck on this for weeks. Can someone help me figure out / debug this problem?

0 Answers:

No answers