logstash

Date: 2016-10-10 13:01:00

Tags: logstash kibana elastic-stack logstash-grok logstash-configuration

I have set up the ELK stack on my Windows machine with the following components:

Elasticsearch
Logstash
Kibana

My logstash.conf:

input {
  file {
    path => "\bin\MylogFile.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
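One thing worth checking (an assumption on my part, not stated in the question): on Windows, the Logstash file input generally needs an absolute path written with forward slashes, and Logstash tracks how far it has read each file in a sincedb file, so a file it has already seen will not be re-ingested on restart. A sketch of the input adjusted along those lines, with a hypothetical path:

```
input {
  file {
    # Hypothetical absolute path; adjust to where MylogFile.log actually lives.
    path => "C:/ELK/bin/MylogFile.log"
    start_position => "beginning"
    # On Windows, "NUL" discards the sincedb so the file is re-read on each run.
    sincedb_path => "NUL"
  }
}
```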

MylogFile.log (an Apache access log):

127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"

When I run Logstash with logstash.conf, it creates the following index in Elasticsearch:

 health  status  index
 yellow  open    logstash-2016.10.06

The index above is empty and does not pick up any data from my log file. Can anyone help? I am new to the ELK stack.

When I query the index logstash-2016.10.10 with http://localhost:9200/logstash-2016.10.10?=pretty=true, I get the following:

"logstash-2016.10.10" : {
    "aliases" : { },
    "mappings" : {
      "_default_" : {
        "_all" : {
          "enabled" : true,
          "omit_norms" : true
        },
        "dynamic_templates" : [ {
          "message_field" : {
            "mapping" : {
              "index" : "analyzed",
              "omit_norms" : true,
              "fielddata" : {
                "format" : "disabled"
              },
              "type" : "string"
            },
            "match" : "message",
            "match_mapping_type" : "string"
          }
        }, {
          "string_fields" : {
            "mapping" : {
              "index" : "analyzed",
              "omit_norms" : true,
              "fielddata" : {
                "format" : "disabled"
              },
              "type" : "string",
              "fields" : {
                "raw" : {
                  "index" : "not_analyzed",
                  "ignore_above" : 256,
                  "type" : "string"
                }
              }
            },

1 Answer:

Answer 0 (score: 0)

Try adding the output section below to your Logstash conf and let us know whether you see any grok parse failures... that would mean the pattern you are using in the filter section is incorrect.

output {
  stdout { codec => json }
  file { path => "C:/POC/output3.txt" }
}

If you do see grok parse failures: start with a very generic expression in the filter section and gradually refine it until the log parses correctly.
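To illustrate that refinement approach (a sketch, not part of the original answer): the sample line in the question is a standard Apache combined-format entry, so you can begin with a catch-all pattern just to confirm that events flow, then switch to the stock `COMBINEDAPACHELOG` pattern that ships with Logstash:

```
filter {
  grok {
    # Step 1: a catch-all pattern that always matches, to confirm events arrive.
    # match => { "message" => "%{GREEDYDATA:raw}" }

    # Step 2: the stock pattern for Apache combined-format access logs.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
```

When a grok pattern fails to match, Logstash adds a `_grokparsefailure` entry to the event's tags field, which is exactly what the stdout/file outputs suggested above will make visible.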