Adjusting log4j2.properties to set a specific log level for Spark only

Time: 2018-06-02 06:36:11

Tags: scala apache-spark log4j log4j2

I'm updating log4j to log4j2 in my work project and trying to get a handle on the syntax changes between the two APIs.

I pulled a sample log4j2.properties from their website, shown below. Since Spark is extremely noisy at the INFO level, I need the console appender to filter out anything below WARN coming from 'org.apache.spark'. In the old API this was simply log4j.logger.org.apache.spark=WARN, but it no longer appears to be that simple.
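For context, here is roughly what the old log4j 1.x file looked like (this mirrors the template Spark ships in its conf/ directory; the appender name `console` is from that template, not from my new config below), where the single override line was enough:

```properties
# log4j 1.x: root at INFO, with a one-line override to quiet Spark
log4j.rootCategory=INFO, console
log4j.logger.org.apache.spark=WARN
```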

Any suggestions would be appreciated.

status = error
dest = err
name = PropertiesConfig

property.filename = target/rolling/rollingtest.log

filter.threshold.type = ThresholdFilter
filter.threshold.level = debug

appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
appender.console.filter.threshold.type = ThresholdFilter
appender.console.filter.threshold.level = info

appender.rolling.type = RollingFile
appender.rolling.name = RollingFile
appender.rolling.fileName = ${filename}
appender.rolling.filePattern = target/rolling2/test1-%d{MM-dd-yy-HH-mm-ss}-%i.log.gz
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = %d %p %C{1.} [%t] %m%n
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 2
appender.rolling.policies.time.modulate = true
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size=10MB
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 5

logger.rolling.name = com.workplace.project
logger.rolling.level = info
logger.rolling.additivity = false
logger.rolling.appenderRef.rolling.ref = RollingFile

rootLogger.level = info
rootLogger.appenderRef.stdout.ref = STDOUT

1 answer:

Answer 0 (score: 0)

Try adding the following lines to your configuration file. Note that the token right after the `logger.` prefix is an arbitrary identifier, not the logger name itself, and it must not contain dots; the actual logger name goes in the `.name` property:

logger.spark.name = org.apache.spark
logger.spark.level = warn
logger.spark.additivity = false
logger.spark.appenderRef.rolling.ref = RollingFile
logger.spark.appenderRef.stdout.ref = STDOUT
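One caveat, based on the log4j2 properties-format documentation rather than anything tested against this exact setup: on log4j2 versions before 2.6, the properties format required every logger (and appender) identifier to be declared up front, so on an older version you would also need something like:

```properties
# Only needed on log4j2 < 2.6, where identifiers must be listed explicitly;
# from 2.6 onward they are discovered automatically
loggers = rolling, spark
```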