Writing Spark worker logs to stdout / stderr

Date: 2019-06-03 11:01:49

Tags: apache-spark logging cluster-computing stdout stderr

Hi, I am trying to redirect the Spark worker logs to stdout / stderr.

I added a custom log4j.properties file as shown below:

log4j.rootLogger = INFO, stdout, stderr
# configure stdout
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Threshold = TRACE
log4j.appender.stdout.filter.filter1=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.stdout.filter.filter1.levelMin = TRACE
log4j.appender.stdout.filter.filter1.levelMax = INFO
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = %d{yy/MM/dd HH:mm:ss,SSS} %p %c: %m%n
# configure stderr
log4j.appender.stderr = org.apache.log4j.ConsoleAppender
log4j.appender.stderr.Threshold = WARN
log4j.appender.stderr.Target = System.err
log4j.appender.stderr.layout = org.apache.log4j.PatternLayout
log4j.appender.stderr.layout.ConversionPattern = %d{yy/MM/dd HH:mm:ss,SSS} %p %c: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.apache.hadoop.util.NativeCodeLoader = ERROR
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.spark=WARN
log4j.logger.org.spark-project.jetty.server=WARN
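For a custom log4j.properties to take effect on the executors at all, it also has to be shipped to them and referenced in their JVM options. A common way to do this with spark-submit is sketched below; the paths, the class name com.example.MyApp, and myapp.jar are placeholders, not from the original post:

```shell
# Ship the custom log4j.properties to every executor with --files and
# point both the driver and executor JVMs at it via -Dlog4j.configuration
# (log4j 1.x). Paths, class name, and jar name are example placeholders.
spark-submit \
  --files /path/to/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.MyApp \
  myapp.jar
```

Note the executor side uses a relative file name, because --files places the file in each executor's working directory.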

Still, the logs are not redirected to the console's stdout / stderr; instead they end up in stdout and stderr files on disk.

Does anyone know how to make this work?

Edit: after reading this blog post, it seems this is simply how it works.
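That matches Spark's standalone mode behavior: the worker launches each executor as a child JVM and captures that process's console streams into per-application files under its work directory (SPARK_WORKER_DIR, by default $SPARK_HOME/work), so a ConsoleAppender on the executor ends up in those files. Assuming a standalone deployment, the live output can be followed there; <app-id> and <executor-id> are placeholders for the actual directory names:

```shell
# Follow an executor's captured console output in standalone mode.
# Substitute the real application and executor directory names.
tail -f $SPARK_HOME/work/<app-id>/<executor-id>/stdout \
        $SPARK_HOME/work/<app-id>/<executor-id>/stderr
```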

0 Answers:

There are no answers.