Merging log lines between dates/times with Filebeat

Time: 2016-10-15 05:36:58

Tags: elasticsearch logstash logstash-grok filebeat

I am trying to push logs to Elasticsearch using Filebeat (no Logstash).

I want to send the following log as a single message, but it gets split into multiple messages, with each line becoming a separate message:

20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - some.java.Exception: write failed. History: [requestHost=123-some.org.com, time=Fri Oct 14 17:49:05 GMT-07:00 2016, exception=java.net.SocketTimeoutException]
[requestHost=123-some.org.com, time=Fri Oct 14 17:49:07 GMT-07:00 2016, exception=java.net.SocketTimeoutException]
[requestHost=123-some.org.com, time=Fri Oct 14 17:49:09 GMT-07:00 2016, exception=java.net.SocketTimeoutException]
 Tried 3 times
        at java.lang.Thread.run(Thread.java:745)
20161014 17:49:09.169 [ERROR] [Thread-3022]

I want to merge all the lines between the two dates (the first line and the last line).

Here is my filebeat.yml snippet:

 paths:
      - /test.log
      multiline.pattern: '^\[0-9]{8}'
      multiline.negate: true
      multiline.match: after

I need to know the correct regex.

I am trying to solve this without using Logstash.

1 Answer:

Answer 0 (score: 2):

Using the following Filebeat configuration on the provided log sample produces two events, each of whose messages starts with a date.

I tested by running ./filebeat -c filebeat.yml -e -v -d "*" with the configuration below. I also tested the pattern on the Go playground.
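For reference, the pattern can be checked with Go's regexp package (Filebeat uses Go regular expressions). The snippet below is a minimal sketch of such a test, not the exact code run on the playground:

package main

import (
	"fmt"
	"regexp"
)

func main() {
	// The event-start pattern from the answer: a line beginning with an 8-digit date.
	pattern := regexp.MustCompile(`^[0-9]{8}`)

	// A few lines taken from the log sample in the question.
	lines := []string{
		"20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - some.java.Exception: write failed.",
		"[requestHost=123-some.org.com, time=Fri Oct 14 17:49:07 GMT-07:00 2016, exception=java.net.SocketTimeoutException]",
		" Tried 3 times",
		"        at java.lang.Thread.run(Thread.java:745)",
		"20161014 17:49:09.169 [ERROR] [Thread-3022]",
	}

	for _, line := range lines {
		// Only the two lines starting with "20161014" should print true.
		fmt.Printf("%-5v %s\n", pattern.MatchString(line), line)
	}
}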

filebeat.yml:

filebeat:
  prospectors:
    - paths: ["input.txt"]
      multiline:
        # A line starting with an 8-digit date (e.g. 20161014) begins a new event.
        pattern: '^[0-9]{8}'
        # Apply the rule to lines that do NOT match the pattern...
        negate:  true
        # ...and append them after the preceding matching line.
        match:   after
output:
  console:
    pretty: false
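With negate: true and match: after, any line that does not match the pattern is appended to the line before it, so each event starts at a date line. The following is a rough, simplified sketch of that grouping logic (an illustration only, not Filebeat's actual implementation; mergeAfterNegated is a hypothetical helper):

package main

import (
	"fmt"
	"regexp"
)

// mergeAfterNegated roughly mimics multiline with negate: true and match: after:
// a line matching the pattern starts a new event; any non-matching line is
// appended to the previous event.
func mergeAfterNegated(lines []string, pattern *regexp.Regexp) []string {
	var events []string
	for _, line := range lines {
		if pattern.MatchString(line) || len(events) == 0 {
			events = append(events, line)
		} else {
			events[len(events)-1] += "\n" + line
		}
	}
	return events
}

func main() {
	pattern := regexp.MustCompile(`^[0-9]{8}`)
	input := []string{
		"20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - write failed.",
		"[requestHost=123-some.org.com, exception=java.net.SocketTimeoutException]",
		" Tried 3 times",
		"        at java.lang.Thread.run(Thread.java:745)",
		"20161014 17:49:09.169 [ERROR] [Thread-3022]",
	}
	// Prints two events, matching the two messages in the Filebeat output below.
	for i, event := range mergeAfterNegated(input, pattern) {
		fmt.Printf("--- event %d ---\n%s\n", i+1, event)
	}
}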

Output:

{   
  "@timestamp": "2016-10-17T14:13:31.292Z",
  "beat": {
    "hostname": "host.example.com",
    "name": "host.example.com",
  },  
  "input_type": "log",
  "message": "20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - some.java.Exception: write failed. History: [requestHost=123-some.org.com, time=Fri Oct 14 17:49:05 GMT-07:00 2016, exception=java.net.SocketTimeoutException]\n[requestHost=123-some.org.com, time=Fri Oct 14 17:49:07 GMT-07:00 2016, exception=java.net.SocketTimeoutException]\n[requestHost=123-some.org.com, time=Fri Oct 14 17:49:09 GMT-07:00 2016, exception=java.net.SocketTimeoutException]\n Tried 3 times\n        at java.lang.Thread.run(Thread.java:745)",
  "offset": 519,
  "source": "input.txt",
  "type": "log"
}   
{   
  "@timestamp": "2016-10-17T14:17:21.686Z",
  "beat": {
    "hostname": "host.example.com",
    "name": "host.example.com",
  },  
  "input_type": "log",
  "message": "20161014 17:49:09.169 [ERROR] [Thread-3022]",
  "offset": 563,
  "source": "input.txt",
  "type": "log"
}