Logstash does not create new indices

Date: 2017-07-29 18:26:17

Tags: elasticsearch logstash elastic-stack

I am a new user of ELK 5.5.1 (Elasticsearch, Logstash, Kibana).

I am building a monitoring server with ELK on Ubuntu 16.04.

I currently have two data sources: Netflow from my router and collectd from my servers.

By default, all the data goes through Logstash fine and is output to Elasticsearch into the same index, "logstash-%{+YYYY.MM.dd}".

The data flows work, but Kibana cannot map both of them into a single index, because some fields carry different data types.

That is why I am trying to send the two data flows to two different indices.

From Kibana I installed X-Pack and set up a new user named "logstash_internal" with the role "logstash_writer", granted all privileges (Cluster Privileges => all, Index Privileges => indices: *, Privileges => all).

I wrote the following configuration file for Logstash, in order to push the data into the two new indices:

input {
 udp {
  port => 25826
  buffer_size => 1452
  codec => collectd { }
 }
 udp {
  port => 1734
  codec => netflow {
   versions => [5, 9]
  }
  type => netflow
 }
}

output {
 if ( [type] == "netflow" ) {
  elasticsearch {
   hosts => ["localhost:9200"]
   user => "logstash_internal"
   password => "logstashmdp"
   index => "lg-OpenWrt-%{+YYYY.MM.dd}"
  }
 } else {
  elasticsearch {
   hosts => ["localhost:9200"]
   user => "logstash_internal"
   password => "logstashmdp"
   index => "lg-Monitor-%{+YYYY.MM.dd}"
  }
 }
}
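One thing worth checking against this configuration: Elasticsearch (since 5.0) refuses to create an index whose name contains uppercase letters, and both "lg-OpenWrt-…" and "lg-Monitor-…" do. A minimal Python sketch of that naming check (the rule set below is my own simplified subset of what Elasticsearch actually enforces, not its full validation):

```python
def is_valid_index_name(name: str) -> bool:
    """Simplified subset of Elasticsearch 5.x index-name restrictions."""
    forbidden = set('\\/*?"<>| ,#')          # characters ES rejects in index names
    return (
        len(name) > 0
        and name == name.lower()             # names must be all lowercase
        and not name.startswith(("-", "_", "+"))
        and not any(c in forbidden for c in name)
    )

is_valid_index_name("lg-openwrt-2017.07.29")  # True
is_valid_index_name("lg-OpenWrt-2017.07.29")  # False: contains uppercase letters
```

If the uppercase letters are indeed the problem, lowercasing the `index` values in both `elasticsearch` outputs would be a quick thing to try.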

But Elasticsearch is not creating the new indices. When I look at http://127.0.0.1:9200/_cat/indices?v and at Timelion in Kibana, the data flows are no longer being received, and the indices "lg-OpenWrt-%{+YYYY.MM.dd}" and "lg-Monitor-%{+YYYY.MM.dd}" do not exist:

health status index                             uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   .monitoring-es-6-2017.07.29       LZMojNYGRDuDw4GCwzBF8w   1   1      14385          180     10.2mb         10.2mb
yellow open   logstash-2017.07.29               lCQ-WltYRpiLIeCGBi900A   5   1       3217            0        1mb            1mb
yellow open   .monitoring-kibana-6-2017.07.29   xYhmEpjwRLu0jTFa1N_ldA   1   1        762            0    438.9kb        438.9kb
yellow open   .monitoring-es-6-2017.07.28       DcwVvdwcSxatRtZUnwxURQ   1   1       7305          162        5mb            5mb
yellow open   .watcher-history-3-2017.07.29     uAq4UMt2QZqoATDx29N79Q   1   1        639            0      551kb          551kb
yellow open   .watcher-history-3-2017.07.28     kVhig4-VQrmN4apudXHd3A   1   1        455            0    515.8kb        515.8kb
yellow open   .triggered_watches                qJnmD7XdQOitkFHLOkjj_g   1   1          0            0     48.1kb         48.1kb
yellow open   .monitoring-logstash-6-2017.07.28 qQVLWxtWQd-ber_3-2UVRw   1   1        135            0    239.1kb        239.1kb
green  open   .security                         k-p9fCvjQjK_MpQ9Y85mfg   1   0          8            0     29.5kb         29.5kb
yellow open   .monitoring-logstash-6-2017.07.29 zFQSFH51QYKTlTcKmzdmow   1   1        378            0    336.4kb        336.4kb
yellow open   logstash-2017.07.28               hAieuJgwSi26nMS9t_zHZw   5   1       1071            0    366.6kb        366.6kb
yellow open   .monitoring-alerts-6              ROR1eoOZTqeVt0QC6aEZPg   1   1          1            0      6.2kb          6.2kb
yellow open   .monitoring-kibana-6-2017.07.28   IRYOmymfTniNraqLKyfleA   1   1        392            0    249.1kb        249.1kb
yellow open   .watches                          NBepeMe7Quuva1VtQXu4SA   1   1          4            0       20kb           20kb
yellow open   .kibana                           Y_hBJIPESReGwuWw-ekfbA   1   1          1            0      3.8kb          3.8kb
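For reference, `%{+YYYY.MM.dd}` is a sprintf date reference that Logstash expands from each event's `@timestamp` in UTC before indexing, so under normal operation a literal "%{+YYYY.MM.dd}" should never appear as an index name. A rough Python illustration of the expected expansion (Logstash itself uses Joda-Time tokens; the mapping of `YYYY.MM.dd` to strftime codes below is my own):

```python
from datetime import datetime, timezone

def expand_index(prefix: str, event_timestamp: datetime) -> str:
    # Logstash expands %{+YYYY.MM.dd} from the event's @timestamp in UTC;
    # the Joda-Time pattern YYYY.MM.dd corresponds to strftime's %Y.%m.%d.
    return prefix + event_timestamp.astimezone(timezone.utc).strftime("%Y.%m.%d")

expand_index("lg-openwrt-", datetime(2017, 7, 29, 18, 19, tzinfo=timezone.utc))
# -> 'lg-openwrt-2017.07.29'
```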

Logstash does not seem to report any error about the new indices I am trying to create. Here is the output configuration returned in the logs with the --debug option:

[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//localhost:9200]
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_internal"
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "lg-OpenWrt-%{+YYYY.MM.dd}"
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "0b84cfc6ccf44fd8b90581b227110ec49b8af127-5"
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_6bd5a44e-3d5c-46f4-95e8-a461793f6ca3", enable_metric=>true, charset=>"UTF-8">
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2017-07-29T18:19:12,411][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2017-07-29T18:19:12,412][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2017-07-29T18:19:12,413][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2017-07-29T18:19:12,417][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_5cb65ae2-45b5-415b-8cde-f80ef84d3b86"
[2017-07-29T18:19:12,417][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2017-07-29T18:19:12,417][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//localhost:9200]
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_internal"
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "lg-Monitor-%{+YYYY.MM.dd}"
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "0b84cfc6ccf44fd8b90581b227110ec49b8af127-6"
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2017-07-29T18:19:12,421][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_5cb65ae2-45b5-415b-8cde-f80ef84d3b86", enable_metric=>true, charset=>"UTF-8">
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2017-07-29T18:19:12,422][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2017-07-29T18:19:12,423][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2017-07-29T18:19:12,424][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2017-07-29T18:19:12,424][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2017-07-29T18:19:12,424][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2017-07-29T18:19:12,424][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2017-07-29T18:19:12,424][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2017-07-29T18:19:12,424][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2017-07-29T18:19:12,435][DEBUG][logstash.agent           ] starting agent
[2017-07-29T18:19:12,436][DEBUG][logstash.agent           ] starting pipeline {:id=>".monitoring-logstash"}
[2017-07-29T18:19:12,441][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2017-07-29T18:19:12,676][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_system:xxxxxx@localhost:9200/]}}
[2017-07-29T18:19:12,677][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://logstash_system:xxxxxx@localhost:9200/, :path=>"/"}
[2017-07-29T18:19:12,814][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x79d28495>}
[2017-07-29T18:19:12,816][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x4e6b7e0f>]}
[2017-07-29T18:19:12,816][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>2}
[2017-07-29T18:19:12,818][INFO ][logstash.pipeline        ] Pipeline .monitoring-logstash started
[2017-07-29T18:19:12,826][DEBUG][logstash.inputs.metrics  ] Metric: input started
[2017-07-29T18:19:12,826][DEBUG][logstash.agent           ] starting pipeline {:id=>"main"}
[2017-07-29T18:19:12,832][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2017-07-29T18:19:12,843][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@localhost:9200/]}}
[2017-07-29T18:19:12,843][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://logstash_internal:xxxxxx@localhost:9200/, :path=>"/"}
[2017-07-29T18:19:12,852][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x2af05878>}
[2017-07-29T18:19:12,855][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-07-29T18:19:12,912][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-07-29T18:19:12,918][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2017-07-29T18:19:12,918][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x21cb14b7>]}
[2017-07-29T18:19:12,918][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2017-07-29T18:19:12,924][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@localhost:9200/]}}
[2017-07-29T18:19:12,924][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://logstash_internal:xxxxxx@localhost:9200/, :path=>"/"}
[2017-07-29T18:19:12,929][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x31bd1c88>}
[2017-07-29T18:19:12,931][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-07-29T18:19:12,937][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-07-29T18:19:12,942][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2017-07-29T18:19:12,943][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x55d3c4f6>]}

Does anyone know why Logstash is not creating these two indices?

Thank you very much for any advice.

0 Answers:

No answers