Unable to run spark-submit or spark-shell

Date: 2016-01-15 16:30:24

Tags: apache-spark

Whenever I run spark-submit, spark-shell, or pyspark on my Hadoop cluster, it keeps printing the following indefinitely:

16/01/15 16:27:50 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:51 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:52 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:53 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:54 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:55 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:56 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:57 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:58 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:27:59 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:28:00 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:28:01 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)
16/01/15 16:28:02 INFO Client: Application report for 
application_1452870745977_0005 (state: ACCEPTED)

I waited a long time, but this message never goes away. Does anyone know what is wrong with spark-shell?
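For context, an application stuck in the ACCEPTED state usually means YARN has accepted the submission but cannot find a healthy NodeManager to launch the ApplicationMaster on. A quick way to check (a diagnostic sketch, not from the original post; the `yarn` command assumes a configured YARN client and is shown commented out) is to look at local disk usage on the worker nodes, since YARN marks nodes with full disks unhealthy:

```shell
# Full local disks cause YARN's disk health checker to mark the
# NodeManager unhealthy, so no containers can be scheduled on it.
# Check utilization of the local filesystem on a worker node:
df -h /tmp

# From a node with a YARN client, list node health states
# (look for nodes reported as UNHEALTHY):
# yarn node -list -all
```

If every node shows as UNHEALTHY, the application will sit in ACCEPTED forever without a clearer error on the client side.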

1 answer:

Answer 0 (score: 0):

This has been resolved. My administrator told me that many of the Hadoop nodes had run out of local disk space because the log files had grown very large.

Once the log files were pruned and disk space was freed, Spark started working again.
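This matches YARN's disk health checking behavior (an inference from this answer, not stated in the original post): when a NodeManager's local directories exceed a utilization threshold, the node is marked unhealthy and stops accepting containers, so submitted applications stay in ACCEPTED. The threshold is controlled by this `yarn-site.xml` property (default 90%):

```xml
<!-- yarn-site.xml: NodeManagers whose local dirs exceed this
     utilization percentage are marked unhealthy and stop accepting
     containers, leaving new applications stuck in ACCEPTED. -->
<property>
  <name>yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage</name>
  <value>90.0</value>
</property>
```

Freeing disk space below the threshold lets the disk health checker mark the nodes healthy again, which is consistent with Spark recovering after the logs were pruned.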