How do I upload a CSV file into Elasticsearch using Logstash?

ee7vknir · Posted 2021-07-13 · in ElasticSearch
Follow (0) | Answers (1) | Views (395)

This is the contents of my logstash.conf file:

input {
  file {
    path => "/home/niteshb/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}                

filter {
  csv {
    separator => ","
    columns => ["tenant_id","hierarchy_name","attribute_name","item_pk"]
  }
}                

output {
  elasticsearch {
    hosts  => "http://localhost:9200"
    index  => "plan_record"
  }
  stdout {}
}

To run it, I'm using:

bin/logstash -f logstash.conf

When I run it, I get the following error:

] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [A-Za-z0-9_-], [ \\t\\r\\n], \"#\", \"=>\" at line 9, column 6 (byte 138) after input {\n  file {\n    path => \"/home/niteshb/*.csv\"\n    start_position => \"beginning\"\n    sincedb_path => \"dev/NULL\"\n  }\n\n  filter {\n  csv", :backtrace=>["/home/niteshb/Music/logstash-7.12.0/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "/home/niteshb/Music/logstash-7.12.0/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/home/niteshb/Music/logstash-7.12.0/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/home/niteshb/Music/logstash-7.12.0/logstash-core/lib/logstash/agent.rb:389:in `block in converge_state'"]}
[2021-04-26T18:15:40,601][INFO ][logstash.runner          ] Logstash shut down.
[2021-04-26T18:15:40,611][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit

Can someone help me solve this? I'm on a Linux system and I can't figure out what's wrong.

twh00eeo1#

It always helps to format your configuration file properly. You're just missing a closing brace at the end of the input and filter sections:

input {
  file {
    path => "/home/niteshb/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}                  <--- add this

filter {
  csv {
    separator => ","
    columns => ["tenant_id","hierarchy_name","attribute_name","item_pk"]
  }
}                  <--- add this

output {
  elasticsearch {
    hosts  => "http://localhost:9200"
    index  => "plan_record"
  }
  stdout {}
}
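As a quick sanity check before rerunning the pipeline, you can count opening versus closing braces in a config file with a one-liner. This is only a rough sketch (it ignores braces inside quoted strings); the file path and snippet below are illustrative, reproducing the truncated input block from the error message:

```shell
# Write a deliberately broken snippet (missing its closing brace) to a temp file.
cat > /tmp/broken.conf <<'EOF'
input {
  file {
    path => "/home/niteshb/*.csv"
  }
EOF

# Count '{' and '}' occurrences; mismatched totals mean an unclosed block.
awk '{o+=gsub(/{/,"&"); c+=gsub(/}/,"&")} END {printf "open=%d close=%d\n", o, c}' /tmp/broken.conf
```

For a proper check, Logstash itself can validate the config without starting the pipeline via `bin/logstash --config.test_and_exit -f logstash.conf`, which reports the same kind of parse error you saw, then exits.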
