I'm working on an ELK stack task.
While configuring grok patterns for multiple components, I'm getting a Logstash configuration error related to the date format. My date looks like 21-Feb-2023 07:30:55.000 (in the component3 if block), and I use dd-MMM-yyyy HH:mm:ss.SSS in the date filter. Because I have multiple components, I use if statements (filebeat.yml is responsible for setting those component fields).
Since I use different date formats, I tried separating them with commas, but it doesn't work!
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "[log][file][path]" => "%{GREEDYDATA}/%{GREEDYDATA:filename}\.log" }
  }
  if [fields][component] == "component1" {
    grok {
      match => { "message" => "(?<context>.*)?t=%{TIMESTAMP_ISO8601:logTime} level=%{LOGLEVEL:logLevel} msg=%{GREEDYDATA:logMessage}" }
    }
  }
  if [fields][component] == "component2" {
    grok {
      match => { "message" => "%{MONTH} %{NUMBER} %{TIME} %{HOSTNAME:host} influxd-systemd-start.sh\[%{NUMBER}\]: ts=%{TIMESTAMP_ISO8601:logTime} lvl=%{LOGLEVEL:logLevel} %{GREEDYDATA:logMessage}" }
    }
  }
  if [fields][component] == "component3" {
    grok {
      pattern_definitions => {
        "CUSTOMMONTH" => "(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)"
        "CUSTOMTIMESTAMP" => "%{MONTHDAY}-%{CUSTOMMONTH}-%{YEAR} %{TIME}"
      }
      match => { "message" => "%{CUSTOMTIMESTAMP:logTime} %{LOGLEVEL:logLevel} \[%{DATA:thread}\] %{JAVACLASS:class} %{GREEDYDATA:logMessage}" }
    }
  }
  date {
    match => ["logTime", "yyyy-MM-dd HH:mm:ss", "dd-MMM-yyyy:HH:mm:ss.SSS", "ISO8601"]
    timezone => "XXX/AAA"
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-analytics_%{+YYYY.MM.dd}"
  }
}
The error I'm getting:
"failed to parse field [logTime] of type [date] in document with id 'hjfghfjhfkjhg'. Preview of field's value: '21-Feb-2023 07:30:55.000'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [21-Feb-2023 07:30:55.000] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"
Can anyone suggest how to handle multiple date formats in the grok patterns / date filter?
Thanks in advance!
1 Answer
You provided the pattern dd-MMM-yyyy:HH:mm:ss.SSS, but it should be dd-MMM-yyyy HH:mm:ss.SSS in order to parse the field value reported in the error log (21-Feb-2023 07:30:55.000).
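For reference, a minimal sketch of the corrected date filter, assuming the rest of the pipeline stays as posted (the timezone value is the question's own placeholder, not a real zone name):

  date {
    # formats are tried in order; the third one now matches "21-Feb-2023 07:30:55.000"
    match => ["logTime", "yyyy-MM-dd HH:mm:ss", "dd-MMM-yyyy HH:mm:ss.SSS", "ISO8601"]
    timezone => "XXX/AAA"   # placeholder from the question; replace with your actual timezone
    target => "@timestamp"
  }

The date filter tries the listed formats in order and uses the first one that parses, so keeping several comma-separated formats is the right approach; only the colon between the year and the hour in the third format needed to become a space.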