Logstash conditional on message content

Date: 2019-01-30 08:10:17

Tags: logstash logstash-grok logstash-configuration

I'm using the ELK Docker stack to aggregate and analyze logs from different sources, but I have a problem with my Logstash configuration.

Filebeat forwards the stream to Logstash, and nothing shows up in Elasticsearch, so I think there is a problem in my Logstash configuration.

There are two different kinds of logs in my Docker logs:

  1. HTTP request logs

    2019-01-29T18:35:15.423Z HTTP INFO "POST /myroute/?param1=test" 201 41 - 44.014 ms

  2. APP logs

    2019-01-29T18:48:19.657Z APP ERROR: {"code":201,"message":"ok"}

I want to detect whether a log line is an "APP" or an "HTTP" entry, and then parse and mutate it accordingly. So, here is my Logstash configuration:

input {
  beats {
    port => 5044
    codec => "json"
  }
}
filter {
    if "HTTP" in [message] {
      grok {
          mapping => { "message" => %{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} "%{WORD:method} %{URIPATHPARAM:url}" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time}
      }
    }
    else if "APP" in [message] {
      grok {
          mapping => { "message" => %{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} %{GREEDYDATA:jsonstring}  }
      }
      json {
            source => "jsonstring"
            target => "doc"
      }
      mutate {
        add_field => {
          "code" => "%{[doc][code]}"
          "message" => "%{[doc][message]}"
        }
      }
    }
  }
}
output { 
    elasticsearch { 
        hosts => ["localhost"] 
    } 
}

I think something goes wrong when I try to check the message content, but I don't know how to fix it. Any ideas?

Thank you very much!

EDIT:

I fixed some issues in my configuration, but it still doesn't work:

input {
  beats {
    port => 5044
    codec => "json"
  }
}
filter {
    if [message] =~ /HTTP/  {
      grok {
          mapping => { "message" => %{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} "%{WORD:method} %{URIPATHPARAM:url}" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time}
      }
    }
    else if [message] =~ /APP/ {
      grok {
          mapping => { "message" => %{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} %{GREEDYDATA:jsonstring}  }
      }
      json {
            source => "jsonstring"
            target => "doc"
      }
      mutate {
        add_field => {
          "code" => "%{[doc][code]}"
          "message" => "%{[doc][message]}"
        }
      }
    }
  }
}
output { 
    elasticsearch { 
        hosts => ["localhost"] 
    } 
}

EDIT 2:

logstash.stdout log:

 Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, \", ', -, [, { at line 10, column 37 (byte 149) after filter {\n    if \"HTTP\" in [message] {\n      grok {\n          mapping => { \"message\" => ", :backtrace=>["/opt/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:42:in `block in execute'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:92:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:92:in `exclusive'", "/opt/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:317:in `block in converge_state'"

Nobody? :(

1 answer:

Answer 0 (score: 0)

Your configuration is syntactically wrong, as the message "Expected one of" indicates. First, you are missing a }. Second, the grok pattern must be declared as a string between " characters, with any " inside the pattern escaped as \". Finally, the option of the grok filter is called match, not mapping.

So, starting from your first conf, here it is with all the problems corrected:

input {
  beats {
    port => 5044
    codec => "json"
  }
}
filter {
    if "HTTP" in [message] {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} \"%{WORD:method} %{URIPATHPARAM:url}\" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time}" }
        }
    } else if "APP" in [message] {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} %{GREEDYDATA:jsonstring}"  }
        }
        json {
            source => "jsonstring"
            target => "doc"
        }
        mutate {
            add_field => {
                "code" => "%{[doc][code]}"
                "message" => "%{[doc][message]}"
            }
        }
    }
}

output { 
    elasticsearch { 
        hosts => ["localhost"] 
    } 
}
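As a quick sanity check outside Logstash, the corrected grok patterns can be approximated with plain regular expressions and run against the two sample lines from the question. The following is a rough Python sketch, not the real grok engine: the character classes are simplifications of the actual grok definitions (URIPATHPARAM, TIMESTAMP_ISO8601, etc. are more precise), and the field-copying at the end mirrors the json/mutate steps:

```python
import json
import re

# Simplified regex equivalents of the two grok patterns above
# (real grok patterns such as URIPATHPARAM are stricter).
HTTP_RE = re.compile(
    r'(?P<timestamp>\S+) (?P<type>\w+) (?P<level>\w+) '
    r'"(?P<method>\w+) (?P<url>\S+)" (?P<code>\d+) (?P<bytes>\d+) - (?P<response_time>.*)'
)
# Note the optional ":" after the level: the sample APP line contains
# "ERROR:", which the grok pattern above may not match as written.
APP_RE = re.compile(
    r'(?P<timestamp>\S+) (?P<type>\w+) (?P<level>\w+):? (?P<jsonstring>.*)'
)

http_line = '2019-01-29T18:35:15.423Z HTTP INFO "POST /myroute/?param1=test" 201 41 - 44.014 ms'
app_line = '2019-01-29T18:48:19.657Z APP ERROR: {"code":201,"message":"ok"}'

# HTTP branch: just the grok-style field extraction.
http_event = HTTP_RE.match(http_line).groupdict()
print(http_event)

# APP branch: extraction, then the json filter (source => "jsonstring",
# target => "doc"), then the mutate/add_field step copying nested values.
app_event = APP_RE.match(app_line).groupdict()
app_event["doc"] = json.loads(app_event["jsonstring"])
app_event["code"] = str(app_event["doc"]["code"])
app_event["message"] = app_event["doc"]["message"]
print(app_event["code"], app_event["message"])
```

If even these simplified regexes fail on a real log line, the corresponding grok pattern is worth re-checking, for example with the Grok Debugger in Kibana.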