SparkContext error - File /tmp/spark-events does not exist

Time: 2016-07-13 11:20:05

Tags: python amazon-web-services apache-spark amazon-ec2 pyspark

I am running a Python Spark application via an API call. On submitting the application, the response is Failed. SSHing into the worker shows the following.

My Python application exists at

/root/spark/work/driver-id/wordcount.py

The error can be found in

/root/spark/work/driver-id/stderr

It shows the following error -

Traceback (most recent call last):
  File "/root/wordcount.py", line 34, in <module>
    main()
  File "/root/wordcount.py", line 18, in main
    sc = SparkContext(conf=conf)
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist.
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:402)
  at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:255)
  at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
  at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
  at py4j.Gateway.invoke(Gateway.java:214)
  at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
  at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
  at py4j.GatewayConnection.run(GatewayConnection.java:209)
  at java.lang.Thread.run(Thread.java:745)

It says - /tmp/spark-events does not exist - which is true. However, in wordcount.py:

from pyspark import SparkContext, SparkConf

... few more lines ...

def main():
    conf = SparkConf().setAppName("MyApp").setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()

4 answers:

Answer 0 (score: 21)

/tmp/spark-events is the location where Spark stores the event logs. Just create this directory on the master machine and you are set.

$ mkdir /tmp/spark-events
$ sudo /root/spark-ec2/copy-dir /tmp/spark-events/
RSYNC'ing /tmp/spark-events to slaves...
ec2-54-175-163-32.compute-1.amazonaws.com
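
If creating the default directory on every machine is not an option, the same error can also be avoided from the application side by pointing the event log at a directory that already exists, or by disabling event logging while debugging. A minimal sketch of the questioner's main(), assuming the target directory exists on the driver host; spark.eventLog.dir and spark.eventLog.enabled are standard Spark configuration keys:

from pyspark import SparkContext, SparkConf

def main():
    conf = (SparkConf()
            .setAppName("MyApp")
            .setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
            # Log events to a directory that already exists on the driver...
            .set("spark.eventLog.dir", "file:///tmp/spark-events")
            # ...or set spark.eventLog.enabled to "false" to skip event logging.
            .set("spark.eventLog.enabled", "true"))
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()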

Answer 1 (score: 7)

I had the same 'File file:/tmp/spark-events does not exist.' error while trying to set up my Spark history server on my local machine. I had customized my log directory to a non-default path. To resolve this, I needed to do two things.

  1. Edit $SPARK_HOME/conf/spark-defaults.conf and add these two lines:

     spark.history.fs.logDirectory /mycustomdir
     spark.eventLog.enabled true

  2. Create a link from /tmp/spark-events to /mycustomdir:

     ln -fs /tmp/spark-events /mycustomdir

Ideally, step 1 would have solved my problem entirely, but I still needed to create the link, so I suspect there may be one other setting I missed. In any case, once I did this I was able to run my history server and see new jobs logged in the web UI.
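
For reference, the two steps above as shell commands - a sketch assuming $SPARK_HOME is set and /mycustomdir is the custom log directory from the answer:

# Append the two settings to spark-defaults.conf.
cat >> "$SPARK_HOME/conf/spark-defaults.conf" <<'EOF'
spark.history.fs.logDirectory /mycustomdir
spark.eventLog.enabled true
EOF

# Link the default location to the custom directory, then start the history server.
ln -fs /tmp/spark-events /mycustomdir
"$SPARK_HOME/sbin/start-history-server.sh"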

Answer 2 (score: 1)

Use spark.eventLog.dir for the client/driver program:

spark.eventLog.dir=/usr/local/spark/history

and use spark.history.fs.logDirectory for the history server:

spark.history.fs.logDirectory=/usr/local/spark/history

As described in: How to enable spark-history server for standalone cluster non hdfs mode

At least as of Spark version 2.2.1.
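
Putting the two keys together, a sketch of the relevant spark-defaults.conf entries, assuming /usr/local/spark/history exists on the driver host and on the machine running the history server:

# Written by running applications (driver side).
spark.eventLog.enabled true
spark.eventLog.dir /usr/local/spark/history

# Read by the history server.
spark.history.fs.logDirectory /usr/local/spark/history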

Answer 3 (score: 0)

I just created /tmp/spark-events on the {master} node and then distributed it to the other nodes on the cluster, and it worked.

mkdir /tmp/spark-events
rsync -a /tmp/spark-events {slaves}:/tmp/spark-events

My spark-defaults.conf:

spark.history.ui.port=18080
spark.eventLog.enabled=true
spark.history.fs.logDirectory=hdfs:///home/elon/spark/events
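
Note that spark.history.fs.logDirectory here points at HDFS, so that path must exist in HDFS as well - the mkdir/rsync commands above only create the local /tmp/spark-events. A sketch, assuming a working HDFS client and the path from the configuration above:

# Create the history server's event-log directory in HDFS.
hdfs dfs -mkdir -p /home/elon/spark/events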