How to get Logstash to process multiple log files when running under Upstart

Asked: 2017-09-02 02:14:35

Tags: logstash

I'm trying to ship logs from both nginx and Tomcat on the same box to our AWS Elasticsearch cluster, but when Logstash is started via Upstart it only processes the nginx logs; the Tomcat logs are never picked up.

I must be missing something simple.

I'm running Logstash on a fairly vanilla Linux box in AWS EC2 running Amazon Linux. I installed Logstash from the RPM repo:

[logstash-5.x]
name=Elastic repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

OS:

$ uname -a
Linux ip-n-n-n-n 4.9.43-17.38.amzn1.x86_64 #1 SMP Thu Aug 17 00:20:39 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux

Logstash version:

# /usr/share/logstash/bin/logstash --version
logstash 5.5.2

I also installed the Logstash output plugin for Amazon ES:

sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-amazon_es

My Logstash configuration is pretty simple:

$ ls /etc/logstash/conf.d/
nginx-logstash.conf  tomcat-app-logstash.conf

$ cat /etc/logstash/conf.d/nginx-logstash.conf 
input {
  file {
    codec => json
    path => "/var/log/nginx/access.log"
  }
}

output {
    amazon_es {
      hosts => ["<domain-id>.us-east-2.es.amazonaws.com"]
      region => "us-east-2"
      index => "environment-logs"
    }
    file {
        path => "/tmp/logstashoutput"
    }
}

$ cat /etc/logstash/conf.d/tomcat-app-logstash.conf 
input {
  file {
    codec => json
    path => "/var/log/tomcat8/tomcat-app.out"
  }
}

output {
    amazon_es {
      hosts => ["<domain-id>.us-east-2.es.amazonaws.com"]
      region => "us-east-2"
      index => "environment-logs"
    }
    file {
        path => "/tmp/logstashoutput"
    }
}
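Worth noting for context: Logstash 5.x does not run each file in `conf.d` as a separate pipeline; all files in the directory are concatenated into a single pipeline, so every event from every input passes through every output (which is why both outputs above fire twice per event). A common way to keep the streams distinguishable in a single merged config is to tag each input with a `type` — a sketch only, the `type` values here are made up:

```
input {
  file {
    codec => json
    path  => "/var/log/nginx/access.log"
    type  => "nginx-access"
  }
  file {
    codec => json
    path  => "/var/log/tomcat8/tomcat-app.out"
    type  => "tomcat-app"
  }
}
```

The `type` field then rides along on each event and can be used in filters, conditionals, or the index name.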

When I run Logstash via Upstart (`initctl start logstash`), only the nginx logs are processed, which I verified both in Kibana and by tailing /tmp/logstashoutput.

# tail -f /var/log/logstash/logstash-plain.log 

...

[2017-09-01T22:55:21,250][WARN ][logstash.runner          ] SIGTERM received. Shutting down the agent.
[2017-09-01T22:55:21,258][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}
[2017-09-01T22:55:52,433][INFO ][logstash.outputs.amazones] Automatic template management enabled {:manage_template=>"true"}
[2017-09-01T22:55:52,484][INFO ][logstash.outputs.amazones] Using mapping template {:template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}}
[2017-09-01T22:55:53,664][INFO ][logstash.outputs.amazones] New Elasticsearch output {:hosts=>["<domain-id>.es.amazonaws.com"], :port=>443}
[2017-09-01T22:55:53,710][INFO ][logstash.outputs.amazones] Automatic template management enabled {:manage_template=>"true"}
[2017-09-01T22:55:53,714][INFO ][logstash.outputs.amazones] Using mapping template {:template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}}
[2017-09-01T22:55:53,792][INFO ][logstash.outputs.amazones] New Elasticsearch output {:hosts=>["<domain-id>.es.amazonaws.com"], :port=>443}
[2017-09-01T22:55:53,808][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-09-01T22:55:54,292][INFO ][logstash.pipeline        ] Pipeline main started
[2017-09-01T22:55:54,376][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-09-01T22:56:20,126][INFO ][logstash.outputs.file    ] Opening file {:path=>"/tmp/logstashoutput"}
[2017-09-01T22:56:20,186][INFO ][logstash.outputs.file    ] Opening file {:path=>"/tmp/logstashoutput"}

When I run Logstash as a plain process as the logstash user, both the nginx logs and the Tomcat logs are processed, again verified in Kibana and by tailing /tmp/logstashoutput.

To do this I gave the logstash user a /bin/bash shell in /etc/passwd and then ran sudo su - logstash:

$ /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
22:22:29.148 [main] INFO  logstash.setting.writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
22:22:29.150 [main] INFO  logstash.setting.writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
22:22:29.200 [LogStash::Runner] INFO  logstash.agent - No persistent UUID file found. Generating new UUID {:uuid=>"c0c03475-cea7-4739-8bf8-cc4ee639f97e", :path=>"/usr/share/logstash/data/uuid"}
22:22:30.048 [[main]-pipeline-manager] INFO  logstash.outputs.amazones - Automatic template management enabled {:manage_template=>"true"}
22:22:30.357 [[main]-pipeline-manager] INFO  logstash.outputs.amazones - Using mapping template {:template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}}
22:22:31.419 [[main]-pipeline-manager] INFO  logstash.outputs.amazones - New Elasticsearch output {:hosts=>["<domain-id>.es.amazonaws.com"], :port=>443}
22:22:31.452 [[main]-pipeline-manager] INFO  logstash.outputs.amazones - Automatic template management enabled {:manage_template=>"true"}
22:22:31.453 [[main]-pipeline-manager] INFO  logstash.outputs.amazones - Using mapping template {:template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}}
22:22:31.507 [[main]-pipeline-manager] INFO  logstash.outputs.amazones - New Elasticsearch output {:hosts=>["<domain-id>.us-east-2.es.amazonaws.com"], :port=>443}
22:22:31.522 [[main]-pipeline-manager] INFO  logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
22:22:31.907 [[main]-pipeline-manager] INFO  logstash.pipeline - Pipeline main started
22:22:32.002 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
22:22:49.214 [[main]>worker0] INFO  logstash.outputs.file - Opening file {:path=>"/tmp/logstashoutput"}
22:22:49.245 [[main]>worker0] INFO  logstash.outputs.file - Opening file {:path=>"/tmp/logstashoutput"}

When Logstash runs via Upstart I expect both the nginx and Tomcat logs to be processed and sent to the outputs, but that doesn't happen.

If there's a configuration difference between running Logstash via Upstart and running it as a plain process as the logstash user, I'm not seeing it.

If there's an environment difference, I'm not seeing it either.
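One way I've tried to spot an environment difference is to compare what the process actually sees in each case, since HOME, group membership, and umask can all differ between an Upstart-launched daemon and an interactive `su` session (and in this version the file input keeps its sincedb state under the running user's HOME by default, so a different HOME can mean different read offsets). A minimal check, run once inside `sudo su - logstash` and once injected into the Upstart job:

```shell
# Print the identity and environment facts that commonly differ between
# an init-launched daemon and an interactive shell session.
sh -c 'echo "HOME=$HOME"; echo "USER=$(id -un)"; echo "GROUPS=$(id -Gn)"; echo "UMASK=$(umask)"'
```

Diffing the two outputs should at least rule identity and HOME in or out as the culprit.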

Thanks! -neil

0 Answers
