Receiving the result of a Spark job launched with SparkLauncher

Asked: 2017-01-03 13:24:20

Tags: java apache-spark spark-launcher

I am launching a Spark job with the following code:

    import org.apache.spark.launcher.SparkLauncher;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public static void main(String[] args) throws InterruptedException, ExecutionException {
        Process sparkProcess;
        try {
            sparkProcess = new SparkLauncher()
                    .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7")
                    .setAppResource("hdfs://server:9000/inputs/test.jar")
                    .setMainClass("com.test.TestJob")
                    .setMaster("spark://server:6066") // REST URL of the Spark master
                    .setVerbose(true)
                    .setDeployMode("cluster")
                    .addAppArgs("abc")
                    .launch();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }

        // Read the child process's stdout on a worker thread and return
        // the process exit code once it terminates.
        ExecutorService executorService = Executors.newSingleThreadExecutor();
        Future<Integer> submit = executorService.submit(() -> {
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(sparkProcess.getInputStream()))) {
                while (sparkProcess.isAlive()) {
                    try {
                        System.out.println("input stream line:" + reader.readLine());
                        Thread.sleep(1000);
                    } catch (IOException e) {
                        throw new RuntimeException(e);
                    }
                }
            }
            return sparkProcess.exitValue();
        });
        System.out.println("Exit value:" + submit.get());
        sparkProcess.waitFor();
    }
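
As a side note, SparkLauncher also exposes startApplication(), which returns a SparkAppHandle so the launching application can track the job's state through a listener instead of scraping the child process's stdout. A minimal sketch reusing the settings above (the wrapper class name is mine):

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    import java.io.IOException;
    import java.util.concurrent.CountDownLatch;

    public class LaunchWithHandle {
        public static void main(String[] args) throws IOException, InterruptedException {
            CountDownLatch done = new CountDownLatch(1);

            SparkAppHandle handle = new SparkLauncher()
                    .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7")
                    .setAppResource("hdfs://server:9000/inputs/test.jar")
                    .setMainClass("com.test.TestJob")
                    .setMaster("spark://server:6066")
                    .setDeployMode("cluster")
                    .addAppArgs("abc")
                    .startApplication(new SparkAppHandle.Listener() {
                        @Override
                        public void stateChanged(SparkAppHandle h) {
                            System.out.println("State: " + h.getState());
                            if (h.getState().isFinal()) {
                                done.countDown(); // FINISHED, FAILED, KILLED, or LOST
                            }
                        }

                        @Override
                        public void infoChanged(SparkAppHandle h) {
                            System.out.println("App id: " + h.getAppId());
                        }
                    });

            done.await(); // block until the application reaches a terminal state
            System.out.println("Final state: " + handle.getState());
        }
    }

Note that the handle reports application state, not values computed inside the job, so it does not by itself return the count.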

The code in my test.jar is as follows:

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SparkSession;

    import java.util.Arrays;

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("test").config("spark.executor.memory", "1g")
                .config("spark.executor.cores", 1).getOrCreate();
        JavaSparkContext context = new JavaSparkContext(spark.sparkContext());
        long count = context.parallelize(Arrays.asList(1, 2, 3, 4, 5)).count();
        System.out.println("Count:" + count); // want to retrieve this value in the launching application
        spark.stop();
    }

I want to retrieve the count in the launching application once the Spark job completes successfully, but I keep receiving null from the InputStream of sparkProcess. What am I doing wrong here?
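
A likely explanation, given the cluster deploy mode used above: readLine() returns null once a stream reaches end-of-file, and in cluster mode the driver runs on a worker node, so the driver's System.out never flows into the launcher process's stdout; that stream only carries spark-submit's own output. One common workaround is to have the job persist its result somewhere the launching application can read it back, for example HDFS. A sketch of the jar's main with that change (the output path is hypothetical):

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SparkSession;

    import java.util.Arrays;
    import java.util.Collections;

    public class TestJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder().appName("test").getOrCreate();
            JavaSparkContext context = new JavaSparkContext(spark.sparkContext());
            long count = context.parallelize(Arrays.asList(1, 2, 3, 4, 5)).count();

            // In cluster mode this println only reaches the driver's log on the
            // worker node, never the launching process's stream.
            System.out.println("Count:" + count);

            // Persist the result where the launching application can read it
            // back (hypothetical output path).
            context.parallelize(Collections.singletonList(count))
                   .saveAsTextFile("hdfs://server:9000/outputs/test-count");

            spark.stop();
        }
    }

The launcher side can then read hdfs://server:9000/outputs/test-count after the process exits.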

0 Answers

No answers yet.