How to make sure a Flink job has finished executing, then run some tasks

Date: 2017-11-11 17:15:05

Tags: apache-flink flink-streaming flink-cep

I want to run some tasks after a Flink job completes. It works fine when I run the code in IntelliJ, but I have problems when I run the Flink jar from a shell script. I use the following lines to make sure the Flink program has finished executing:

// start the execution
JobExecutionResult jobExecutionResult = environment.execute("Started the execution");
is_job_finished = jobExecutionResult.isJobExecutionResult();

I am not sure whether the above check is correct.

I then use that variable to run some tasks:

    if (print_mode && is_job_finished) {
        System.out.println("\n\n-- System related variables --\n");
        System.out.println("Stream_join window length = " + WindowLength_join__ms + " milliseconds");
        System.out.println("Input rate for stream RR = " + input_rate_rr_S + " events/second");
        System.out.println("Stream RR runtime = " + Stream_RR_RunTime_S + " seconds");
        System.out.println("# raw events in stream RR = " + Total_Number_Of_Events_in_RR + "\n");
    }

Any suggestions?

3 Answers:

Answer 0 (score: 0)

Basically, anything you place after the env.execute() call runs once the job has finished. Note, however, that this only applies to jobs run in batch mode; there is no such possibility with streaming execution.
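For instance, in the batch (DataSet) API, execute() blocks until the job completes, so code placed after it only runs once the job is done. A minimal Java sketch, assuming a placeholder pipeline and output path:

    import org.apache.flink.api.common.JobExecutionResult;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class BatchJobExample {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // A trivial batch pipeline; "/tmp/flink-out" is a placeholder path.
            env.fromElements(1, 2, 3).writeAsText("/tmp/flink-out");

            // execute() blocks until the batch job has finished.
            JobExecutionResult result = env.execute("example batch job");

            // Anything below here runs only after the job has completed.
            System.out.println("Job finished in " + result.getNetRuntime() + " ms");
        }
    }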

Answer 1 (score: 0)

You can register a job listener with the execution environment.

For example:

    env.registerJobListener(new JobListener {
      // Callback on job submission.
      override def onJobSubmitted(jobClient: JobClient, throwable: Throwable): Unit = {
        if (throwable == null) {
          log.info("SUBMIT SUCCESS")
        } else {
          log.info("FAIL")
        }
      }

      // Callback on job execution finished, successfully or unsuccessfully.
      override def onJobExecuted(jobExecutionResult: JobExecutionResult, throwable: Throwable): Unit = {
        if (throwable == null) {
          log.info("SUCCESS")
        } else {
          log.info("FAIL")
        }
      }
    })

Answer 2 (score: 0)

Register a JobListener on your StreamExecutionEnvironment.
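The same idea as the Scala example above, as a minimal Java sketch (registerJobListener is available from Flink 1.10 onward; the pipeline below is just a placeholder):

    import org.apache.flink.api.common.JobExecutionResult;
    import org.apache.flink.core.execution.JobClient;
    import org.apache.flink.core.execution.JobListener;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class ListenerExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            env.registerJobListener(new JobListener() {
                @Override
                public void onJobSubmitted(JobClient jobClient, Throwable throwable) {
                    if (throwable == null) {
                        System.out.println("Job submitted: " + jobClient.getJobID());
                    }
                }

                @Override
                public void onJobExecuted(JobExecutionResult result, Throwable throwable) {
                    if (throwable == null) {
                        // Runs once the job has finished; put post-job tasks here.
                        System.out.println("Job finished in " + result.getNetRuntime() + " ms");
                    }
                }
            });

            // A trivial streaming pipeline as a placeholder.
            env.fromElements(1, 2, 3).print();
            env.execute("listener example");
        }
    }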
