
php - How to solve data loss when Logstash writes data to Elasticsearch
歐陽克 2017-07-01 09:11:55

My log file contains more than 600 records, but only a bit over 300 of them are written to Elasticsearch.
Does anyone know what could cause this?
This is my configuration:
input {
  file {
    path => ["/usr/local/20170730.log"]
    type => "log_test_events"
    tags => ["log_tes_events"]
    start_position => "beginning"
    sincedb_path => "/data/logstash/sincedb/test.sincedb"
    codec => "json"
    close_older => "86400"    # 1 day
    ignore_older => "86400"
  }
  beats { port => 5044 }
}
filter {
  urldecode {
    all_fields => true
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash_%{event_date}"
  }
  stdout { codec => json }
}
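One way to narrow down where the missing events disappear (a sketch, not part of the original post; it reuses the log path from the config above) is to run the same file through a pipeline whose only output is stdout and count the emitted lines. Setting sincedb_path to /dev/null makes Logstash forget its read position and re-read the whole file on every run:

input {
  file {
    path => ["/usr/local/20170730.log"]
    start_position => "beginning"
    sincedb_path => "/dev/null"    # forget read position, re-read from the start
    codec => "json"
  }
}
output {
  stdout { codec => json_lines }   # one JSON document per line
}

If piping the output through wc -l shows all 600+ events (Logstash's own log lines may need to be filtered out, depending on the version), the file is being read completely and the loss is happening on the Elasticsearch side.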


Replies (1)
過去多啦不再A夢

This happens because, as Logstash writes the log, Elasticsearch's dynamic mapping infers each field's data type from the first document it receives. For example, if a field a sometimes holds an integer and sometimes a string, and the first indexed document carries a number, the field is mapped as an integer type; every later document where a is a string is then rejected by Elasticsearch instead of being indexed, so only part of the data shows up.
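A minimal illustration (hypothetical log lines; the field name event_id is borrowed from the template below): if these two events arrive in this order, dynamic mapping locks event_id to a numeric type from the first one, and the second fails with a mapper_parsing_exception:

{"event_id": 1001, "message": "first event, event_id mapped as a number"}
{"event_id": "A-1001", "message": "second event, rejected: not parseable as a number"}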

Modify the configuration and define an explicit mapping:
output {
  elasticsearch {
    hosts  => "localhost:9200"
    index  => "test1"
    manage_template => true
    template_overwrite => true
    template => "/usr/local/logstash/templates/stat_day.json"
  }
}

The stat_day.json template format:

{
  "order": 1,
  "template": "test1",
  "mappings": {
    "log_test": {
      "properties": {
        "event_id": { "type": "string" }
      }
    }
  }
}
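One caveat, since this answer dates from 2017: the "string" type is Elasticsearch 2.x mapping syntax. On Elasticsearch 5.x and 6.x the same explicit mapping would use "keyword" (exact value, not analyzed) or "text" (analyzed), along the lines of:

{
  "order": 1,
  "template": "test1",
  "mappings": {
    "log_test": {
      "properties": {
        "event_id": { "type": "keyword" }
      }
    }
  }
}

(From 6.x on, the top-level "template" key is replaced by "index_patterns", and mapping types such as "log_test" go away entirely in 7.x.)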
