Airflow task instances stuck in retry mode

Time: 2018-08-29 10:09:51

Tags: airflow airflow-scheduler

We are running Airflow version 1.9 with the Celery executor. Our task instances are stuck in retry mode. When a job fails, the task instance retries; after the retry attempt runs, it falls back into a new retry wait.

First state:

Task is not ready for retry yet but will be retried automatically. Current date is 2018-08-28T03:46:53.101483 and task will be retried at 2018-08-28T03:47:25.463271.

After some time:

All dependencies are met but the task instance is not running. In most cases this just means that the task will probably be scheduled soon unless:
- The scheduler is down or under heavy load

If this task instance does not start soon please contact your Airflow administrator for assistance

After some more time, it goes back into retry mode:

Task is not ready for retry yet but will be retried automatically. Current date is 2018-08-28T03:51:48.322424 and task will be retried at 2018-08-28T03:52:57.893430.

This happens for all of our DAGs. We created a test DAG and tried to capture the scheduler and worker logs for it.

from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'Pramiti',
    'depends_on_past': False,
    'retries': 3,
    'retry_delay': timedelta(minutes=1)
}

dag = DAG('airflow-examples.test_failed_dag_v2', description='Failed DAG',
          schedule_interval='*/10 * * * *',
          start_date=datetime(2018, 9, 7), default_args=default_args)

b = BashOperator(
    task_id="ls_command",
    bash_command="mdr",  # invalid command, so the task fails and triggers retries
    dag=dag
)
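For context on the log messages above, the "task will be retried at …" timestamp is the failed try's end time plus the configured `retry_delay`. A minimal sketch of that eligibility check (hypothetical helper names, not Airflow's actual internals):

```python
from datetime import datetime, timedelta

def next_retry_at(end_date, retry_delay):
    """Hypothetical sketch: a failed try becomes eligible for retry
    only once end_date + retry_delay has passed."""
    return end_date + retry_delay

def ready_for_retry(end_date, retry_delay, now):
    """True once the retry wait has elapsed."""
    return now >= next_retry_at(end_date, retry_delay)

# Timestamps roughly matching the first log message above.
failed_at = datetime(2018, 8, 28, 3, 46, 25)
delay = timedelta(minutes=1)

# While this is False, the scheduler reports
# "Task is not ready for retry yet but will be retried automatically."
print(ready_for_retry(failed_at, delay, datetime(2018, 8, 28, 3, 46, 53)))  # False
print(ready_for_retry(failed_at, delay, datetime(2018, 8, 28, 3, 47, 30)))  # True
```

Between the moment this check turns True and the moment a worker actually picks the task up, the UI shows the "All dependencies are met but the task instance is not running" message quoted above.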

Task Logs

Scheduler Logs

Worker Logs

0 Answers:

No answers yet