Logging

"mapper_parsing_exception" error from Elasticsearch: failed to parse the timestamp from %{COMBINEDAPACHELOG}

  • August 31, 2018

I have configured Logstash to filter httpd_access_log messages and parse them with COMBINEDAPACHELOG. However, I am getting errors like the following:

[2017-02-10T15:37:39,361][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeats", :_type=>"logs", :_routing=>nil}, 2017-02-10T23:37:34.187Z perf-wuivcx02.hq.mycompany.com cdn.mycompany.com 192.168.222.60 - - [10/Feb/2017:15:37:30 -0800] "GET /client/asd-client-main.js HTTP/1.1" 200 221430 "http://perf.companysite.com/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36"], :response=>{"index"=>{"_index"=>"filebeats", "_type"=>"logs", "_id"=>"AVoqY6qkpAiTDgWeyMHJ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"For input string: \"10/Feb/2017:15:37:30 -0800\""}}}}}

Here is my Logstash filter configuration:

filter {
 if [type] == "json" {
   json {
     source => "message"
   }
 }
 if [type] == "syslog" {
   grok {
     match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
     add_field => [ "received_at", "%{@timestamp}" ]
     add_field => [ "received_from", "%{host}" ]
   }
   date {
     match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
   }
 }
 if [type] == "httpd_access_log" {
   grok {
     match => { "message" => "%{URIHOST} %{COMBINEDAPACHELOG}" }
   }
   date {
     match => [ "timestamp", "MMM dd yyyy HH:mm:ss", "MMM  d yyyy HH:mm:ss", "ISO8601" ]
   }
 }
}

The date filter works for the syslog-type messages, but not for the httpd_access_log messages. Does anyone know why the timestamp would cause lines from the httpd_access_log file to fail to be indexed in Elasticsearch?

Thanks in advance for any help or advice!

This is not 100% a filter problem; the output is just the symptom. Here are the key parts of the error message it is showing you.

[2017-02-10T15:37:39,361][WARN ][logstash.outputs.elasticsearch]

This tells you that the plugin that failed is the elasticsearch output.

Failed action. {:status=>400, :action=>["index",

(snipped for clarity) That was while attempting an index action against Elasticsearch.

"error"=>
 {"type"=>"mapper_parsing_exception",
  "reason"=>"failed to parse [timestamp]",
  "caused_by"=>
    {"type"=>"number_format_exception",
     "reason"=>"For input string: \"10/Feb/2017:15:37:30 -0800\""}
    }
  }

What is happening here is that the timestamp field in the index will not accept the string you are trying to put into it. The number_format_exception tells you that Elasticsearch was expecting something other than a string as input.
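A quick way to see what Logstash is actually sending (a debugging sketch, not part of the original answer) is to add a temporary stdout output with the rubydebug codec and check whether the timestamp field still holds the raw Apache string when the event leaves the pipeline:

output {
 # Temporary debug output: prints every event so you can see
 # whether [timestamp] still contains the raw access-log string.
 stdout { codec => rubydebug }
}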

Logstash is trying to write a string into that timestamp field. This suggests that the timestamp field is not actually going through the date {} filter, which in turn suggests that either if [type] == "httpd_access_log" { is not catching every possible instance of timestamp, or the patterns in your date filter are not matching this one. The error string is clear, but I am not sure your source is really emitting timestamps like:

10/Feb/2017:15:37:30 -0800

If it really is coming into the pipeline like that, figure out why.
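The answer stops at the diagnosis, but as a sketch of the usual fix (assuming the timestamp really is in Apache's default access-log format, as the error string suggests): the date filter needs a pattern that matches dd/MMM/yyyy:HH:mm:ss Z, rather than the month-day-year patterns in the configuration above, for example:

date {
 # Matches Apache's default timestamp, e.g. 10/Feb/2017:15:37:30 -0800
 match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
}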

Quoted from: https://serverfault.com/questions/831947