Spark history in standalone cluster mode

Time: 2016-10-18 05:21:48

Tags: apache-spark

I came across the text below on the Spark website. I am trying to view a Spark application's UI logs even after the application has finished or been killed. Is there any way to view these logs when running in standalone mode?

If Spark is run on Mesos or YARN, it is still possible to construct the UI of an application through Spark's history server, provided that the application's event logs exist. You can start the history server by executing:

./sbin/start-history-server.sh
This creates a web interface at http://<server-url>:18080 by default, listing incomplete and completed applications and attempts.

When using the file-system provider class (see spark.history.provider below), the base logging directory must be supplied in the spark.history.fs.logDirectory configuration option, and should contain sub-directories that each represents an application’s event logs.
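Not from the quoted docs, but as a rough sketch of what that could look like in practice (assuming the hdfs://namenode/shared/spark-logs directory used in the example below), the history server side might be configured either in conf/spark-defaults.conf or via SPARK_HISTORY_OPTS in conf/spark-env.sh:

# conf/spark-defaults.conf on the host running the history server
spark.history.fs.logDirectory  hdfs://namenode/shared/spark-logs

# or, equivalently, in conf/spark-env.sh
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://namenode/shared/spark-logs"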

The spark jobs themselves must be configured to log events, and to log them to the same shared, writeable directory. For example, if the server was configured with a log directory of hdfs://namenode/shared/spark-logs, then the client-side options would be:

spark.eventLog.enabled  true
spark.eventLog.dir      hdfs://namenode/shared/spark-logs
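As an illustrative sketch (not part of the quoted documentation), the same two settings can also be passed per job through spark-submit's --conf flags; the master URL and jar name below are placeholders:

# submit against a standalone master; spark://master:7077 and my-app.jar are placeholders
./bin/spark-submit \
  --master spark://master:7077 \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs://namenode/shared/spark-logs \
  my-app.jar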

0 Answers:

No answers