elasticsearch - date from CSV imported with Logstash is not parsed as a datetime type

Date: 2017-06-14 11:03:27

Tags: elasticsearch logstash logstash-grok

I am trying to import a CSV into Elasticsearch using Logstash, and I have tried two approaches:

  1. Using the csv filter
  2. Using a grok filter

    1) For the csv filter, below is my Logstash config file:

    input {
      file {
        path => "path_to_my_csv.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    filter {
      csv {
            separator => ","
            columns => ["col1","col2_datetime"]
      }
      mutate {convert => [ "col1", "float" ]}
      date {
            locale => "en"
        match => ["col2_datetime", "ISO8601"] # also tried: match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
            timezone => "Asia/Kolkata"
        target => "@timestamp" # also tried: target => "col2_datetime"
       }
    }
    output {
       elasticsearch {
         hosts => "http://localhost:9200"
         index => "my_collection"
    
      }
      stdout {}
    }
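As an aside, the Joda-style pattern `yyyy-MM-dd HH:mm:ss` tried in the `match` option corresponds roughly to Python's `%Y-%m-%d %H:%M:%S`. A quick local sanity check of the format against a sample value (an illustration outside Logstash, not part of the original setup):

```python
from datetime import datetime

# Joda pattern "yyyy-MM-dd HH:mm:ss" roughly maps to this strptime format
FMT = "%Y-%m-%d %H:%M:%S"

sample = "2016-12-02 19:00:52"
parsed = datetime.strptime(sample, FMT)
print(parsed)  # confirms the pattern matches the sample value
```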
    

    2) Using a grok filter:

    For the grok filter, below is my Logstash config file:

    input {
      file {
        path => "path_to_my_csv.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    filter {
      grok {
        match => { "message" => "(?<col1>(?:%{BASE10NUM})),(%{TIMESTAMP_ISO8601:col2_datetime})"}
        remove_field => [ "message" ]
      }
      date {
            match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
       }
    }
    output {
       elasticsearch {
         hosts => "http://localhost:9200"
         index => "my_collection_grok"
    
      }
      stdout {}
    }
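The grok pattern above captures a number into `col1` and an ISO-8601 timestamp into `col2_datetime`. Its behavior can be mirrored in Python with simplified stand-ins for the `BASE10NUM` and `TIMESTAMP_ISO8601` grok patterns (a rough sketch; the real grok patterns are more permissive):

```python
import re

# Simplified approximations of grok's BASE10NUM and TIMESTAMP_ISO8601
BASE10NUM = r"[+-]?\d+(?:\.\d+)?"
TIMESTAMP = r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}"

LINE_RE = re.compile(rf"(?P<col1>{BASE10NUM}),(?P<col2_datetime>{TIMESTAMP})")

m = LINE_RE.match("1234365,2016-12-02 19:00:52")
print(m.group("col1"), m.group("col2_datetime"))
```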
    

    Problem:

    When I run either of these two files, I am able to import the data into Elasticsearch. But my date field is not parsed as a datetime type; it is saved as a string, so I cannot run date-based filters on it.

    Can someone help me figure out why this is happening? My Elasticsearch version is 5.4.1.

    Thanks in advance.

1 Answer:

Answer 0 (score: 0)

I made 2 changes to the config file:

1) Removed the underscore from the column name col2_datetime (renaming it to col2)

2) Added a target to the date filter

Here is what my config file looks like...

vi logstash.conf

input {
  file {
    path => "/config-dir/path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
        separator => ","
        columns => ["col1","col2"]
  }
  mutate {convert => [ "col1", "float" ]}
  date {
        locale => "en"
        match => ["col2",  "yyyy-MM-dd HH:mm:ss"]
        target => "col2"
   }
}
output {
   elasticsearch {
     hosts => "http://172.17.0.1:9200"
     index => "my_collection"

  }
  stdout {}
}
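The filter chain in this config (csv split, mutate convert, date parse into `col2`) can be sketched per row in plain Python to show what each event should end up containing (a hypothetical analogue for illustration, not Logstash internals):

```python
from datetime import datetime

def process_row(line: str) -> dict:
    """Rough Python equivalent of the csv + mutate + date filters above."""
    col1, col2 = (f.strip() for f in line.split(","))
    return {
        "col1": float(col1),                                   # mutate { convert }
        "col2": datetime.strptime(col2, "%Y-%m-%d %H:%M:%S"),  # date { target => "col2" }
    }

event = process_row("1234365,2016-12-02 19:00:52")
print(event)
```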

And here is the data file:

vi path_to_my_csv.csv

1234365,2016-12-02 19:00:52 
1234368,2016-12-02 15:02:02 
1234369,2016-12-02 15:02:07
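One detail worth noting: the first two rows above appear to have a trailing space after the timestamp. A strict parser rejects such input, so if the date filter still tags events with `_dateparsefailure`, stripping the field first (e.g. with the mutate filter's `strip` option) may be worth trying. A Python illustration of the idea (not the Joda parser Logstash actually uses):

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S"
raw = "2016-12-02 19:00:52 "  # note the trailing space, as in the rows above

try:
    datetime.strptime(raw, FMT)
    failed = False
except ValueError:  # "unconverted data remains"
    failed = True

print("strict parse failed:", failed)      # strict parse failed: True
print(datetime.strptime(raw.strip(), FMT))  # stripping first succeeds
```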