Logstash only gives me two log lines

Time: 2015-02-17 18:10:56

Tags: elasticsearch logstash

Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}
Using milestone 2 filter plugin 'csv'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}

My configuration:

input {
  file {
    path => [ "e:\mycsvfile.csv" ]
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["col1","col2"]
    source => "csv_data"
    separator => ","
  }
}

output {
  elasticsearch {
    host => localhost
    port => 9200
    index => test
    index_type => test_type
    protocol => http
  }
  stdout {
    codec => rubydebug
  }
}

My environment: Windows 8, logstash 1.4.2

Question: Has anyone run into this? Where are the logstash logs? Are there known logstash bugs on Windows? From what I can tell, logstash isn't doing anything.

What I tried:

logstash.bat agent -f test.conf --verbose
Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}
Using milestone 2 filter plugin 'csv'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones {:level=>:warn}
Registering file input {:path=>["e:/temp.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"C:\Users\gemini/.sincedb_d8e46c18292a898ea0b5b1cd94987f21", :path=>["e:/temp.csv"], :level=>:info}
Pipeline started {:level=>:info}
New Elasticsearch output {:cluster=>nil, :host=>"localhost", :port=>9200, :embedded=>false, :protocol=>"http", :level=>:info}
Automatic template management enabled {:manage_template=>"true", :level=>:info}
Using mapping template {:template=>"{ \"template\" : \"logstash-*\", \"settings\" : { \"index.refresh_interval\" : \"5s\" }, \"mappings\" : { \"_default_\" : { \"_all\" : {\"enabled\" : true}, \"dynamic_templates\" : [ { \"string_fields\" : { \"match\" : \"*\", \"match_mapping_type\" : \"string\", \"mapping\" : { \"type\" : \"string\", \"index\" : \"analyzed\", \"omit_norms\" : true, \"fields\" : { \"raw\" : {\"type\": \"string\", \"index\" : \"not_analyzed\", \"ignore_above\" : 256} } } } } ], \"properties\" : { \"@version\": { \"type\": \"string\", \"index\": \"not_analyzed\" }, \"geoip\" : { \"type\" : \"object\", \"dynamic\": true, \"path\": \"full\", \"properties\" : { \"location\" : { \"type\" : \"geo_point\" } } } } } }", :level=>:info}

It stays like that for a while, and no new index is created in elasticsearch.
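Part of the question is where logstash's own log ends up. A minimal sketch, assuming the standard logstash 1.4.x agent flags (-l/--log and --debug), that writes that log to a file and raises the verbosity:

rem write logstash's own log to logstash.log instead of only the console,
rem and log at debug level to get more detail from the file input
logstash.bat agent -f test.conf --debug -l logstash.log

This only changes how much diagnostic output is collected and where it goes; it does not by itself change what the pipeline does.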

2 answers:

Answer 0 (score: 0)

I had to add:

sincedb_path => "NIL"

and it worked.

http://logstash.net/docs/1.1.0/inputs/file#setting_sincedb_path

  

sincedb_path — Value type is string. There is no default value for this setting. Where to write the sincedb database (keeps track of the current position of monitored log files). Defaults to the value of the environment variable "$SINCEDB_PATH" or "$HOME/.sincedb".

Several sincedb files had been generated for me in C:\users\{user}.
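For context, a minimal sketch of how this setting slots into the file input from the question; the explicit C:/logstash/... path is only a hypothetical writable location, not something taken from this answer:

input {
  file {
    path => [ "e:\mycsvfile.csv" ]
    start_position => "beginning"
    # hypothetical explicit location for the sincedb file, which is where
    # the file input records how far it has already read each monitored file
    sincedb_path => "C:/logstash/mycsvfile.sincedb"
  }
}

Keeping the sincedb in a known place also makes it easy to delete when you want logstash to re-read the CSV from the beginning.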

Answer 1 (score: 0)

When using CSV as the input data, I had to add:

sincedb_path => "NIL" inside the file {} block

Example:

input {
  file {
    path => [ "C:/csvfilename.txt" ]
    start_position => "beginning"
    sincedb_path => "NIL"
  }
}

It works with logstash version 1.4.2.
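As a quick way to confirm that events actually reached elasticsearch (assuming it is running locally on port 9200 as in the question's output, and a version that has the _cat API), something like this can be checked with curl or a browser:

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/test/_search?pretty"

The first call lists the indices with their document counts; the second returns a few documents from the test index named in the output section.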
