Google Cloud DataFlow job not available yet.

Date: 2018-10-02 11:52:22

Tags: google-cloud-platform google-cloud-dataflow airflow dataflow airflow-scheduler

I am running a Dataflow job from Airflow. I should mention that I am new to Airflow. The Dataflow job (launched from Airflow) completes successfully, but Airflow seems to have trouble fetching the job status, and I receive this message endlessly:

  Google Cloud DataFlow job not available yet.

Here is the log after all the steps have been added to the Dataflow job (I have substituted {projectID} and {jobID} for the real values):

[2018-10-01 13:00:13,987] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,987] {gcp_dataflow_hook.py:128} WARNING - b'INFO: Staging pipeline description to gs://my-project/staging'

[2018-10-01 13:00:13,987] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,987] {gcp_dataflow_hook.py:128} WARNING - b'Oct 01, 2018 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-01_06_00_12-{jobID}?project={projectID}'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'Oct 01, 2018 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b"INFO: To cancel the job using the 'gcloud' tool, run:"

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'> gcloud dataflow jobs --project={projectID} cancel --region=us-central1 2018-10-01_06_00_12-{jobID}'

[2018-10-01 13:00:13,990] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,990] {discovery.py:267} INFO - URL being requested: GET https://www.googleapis.com/discovery/v1/apis/dataflow/v1b3/rest

[2018-10-01 13:00:14,417] {logging_mixin.py:95} INFO - [2018-10-01 13:00:14,417] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:14,593] {logging_mixin.py:95} INFO - [2018-10-01 13:00:14,593] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

[2018-10-01 13:00:29,614] {logging_mixin.py:95} INFO - [2018-10-01 13:00:29,614] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:29,772] {logging_mixin.py:95} INFO - [2018-10-01 13:00:29,772] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

[2018-10-01 13:00:44,790] {logging_mixin.py:95} INFO - [2018-10-01 13:00:44,790] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:44,937] {logging_mixin.py:95} INFO - [2018-10-01 13:00:44,937] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

Do you know what might be causing this? I could not find any solution related to this issue. Should I provide more information?

Here is my task in the DAG:

# Dataflow task
dataflow_t = DataFlowJavaOperator(
    task_id='mydataflow',
    jar='/lib/dataflow_test.jar',
    gcp_conn_id='my_gcp_conn',
    delegate_to='{service_account}@{projectID}.iam.gserviceaccount.com',
    dag=dag)

and the Dataflow options passed to the DAG via default_args:

'dataflow_default_options': {
    'project': '{projectID}',
    'stagingLocation': 'gs://my-project/staging'
}

1 Answer:

Answer 0 (score: 3)

I ran into the same problem. I had set a job name in my DataflowPipelineOptions, but Airflow also generates a job name based on the task ID you provide.

So there is a conflict, and Airflow cannot find the actual job name that you set via DataflowPipelineOptions.

Just remove the job name from your DataflowPipelineOptions and it should work.
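A minimal sketch of what that change might look like inside the Java pipeline, assuming a standard Beam setup (the class name and option values here are placeholders, not taken from the question):

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class PipelineSetup {
    static DataflowPipelineOptions buildOptions() {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setProject("my-project");                      // placeholder project ID
        options.setStagingLocation("gs://my-project/staging"); // placeholder bucket
        // options.setJobName("my-job");  // removed: when launched via Airflow's
        // DataFlowJavaOperator, Airflow derives the job name from the task_id and
        // polls Dataflow for that name, so setting one here causes a mismatch.
        return options;
    }
}
```

With the explicit job name removed, the name Airflow generates is the one the job actually runs under, so the status-polling loop in gcp_dataflow_hook can find it.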