fluent-plugin-elasticsearch: "Could not push log to Elasticsearch" error, "error" => {"type" => "mapper_parsing_exception"}

Asked: 2017-04-09 15:03:33

Tags: elasticsearch fluent

When I inject data collected by Fluentd into Elasticsearch using fluent-plugin-elasticsearch, some of the data causes the following error:

2017-04-09 23:47:37 +0900 [error]: Could not push log to Elasticsearch: {"took"=>3, "errors"=>true, "items"=>[{"index"=>{"_index"=>"logstash-201704", "_type"=>"ruby", "_id"=>"AVtTLz_cUzkwT9CQCxrH", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [message]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:27"}}}}, .....]}

It seems that Elasticsearch rejects the data with the errors failed to parse [message] and Can't get text on a START_OBJECT at 1:27, but I cannot see which data was sent to Elasticsearch or what was wrong with it.

Any ideas?

1 Answer:

Answer 0 (score: 0)

fluent-plugin-elasticsearch sends data via the _bulk API. I put request-dumping code into /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/elasticsearch-api-5.0.4/lib/elasticsearch/api/actions/bulk.rb as follows:

  def bulk(arguments={})
    ...
      payload = body
    end
    $log.info([method, path, params, payload].inspect)  # <=== here ($log is global logger of fluentd)
    perform_request(method, path, params, payload).body

This showed that the request sent to Elasticsearch was as follows:

POST /_bulk
{"index":{"_index":"logstash-201704","_type":"ruby"}}
{"level":"INFO","message":{"status":200,"time":{"total":46.26,"db":33.88,"view":12.38},"method":"PUT","path":"filtered","params":{"time":3815.904,"chapter_index":0},"response":[{}]},"node":"main","time":"2017-04-09T14:39:06UTC","tag":"filtered.console","@timestamp":"2017-04-09T23:39:06+09:00"}
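The shape of the offending document can be checked directly from the dumped payload. A minimal sketch (document body abbreviated here for readability):

```ruby
require 'json'

# Second line of the dumped _bulk payload (the document body), abbreviated
doc = JSON.parse('{"level":"INFO","message":{"status":200,"method":"PUT"},"node":"main"}')

# The "message" field is a JSON object (a Hash after parsing), not a string,
# which is what triggers "Can't get text on a START_OBJECT".
puts doc["message"].class  # => Hash
```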

The problem is that the message field contains a JSON object, even though that field is mapped as an analyzed string in Elasticsearch.
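One way to avoid the mapping conflict (a sketch, not part of the original answer) is to serialize the nested object into a plain string before the record reaches Elasticsearch, for example in a custom Fluentd filter:

```ruby
require 'json'

# Hypothetical record as Fluentd would hand it to the output plugin
record = {
  "level"   => "INFO",
  "message" => { "status" => 200, "method" => "PUT" }  # nested object
}

# Flatten the nested object so Elasticsearch receives an analyzable string
record["message"] = JSON.generate(record["message"]) if record["message"].is_a?(Hash)

puts record["message"]  # => {"status":200,"method":"PUT"}
```

Alternatively, the mapping for message could be changed to `object`, but that only works if every producer sends an object in that field.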
