Logstash 7.9.1 Docker container: file input not working

rfbsl7qr  posted on 2021-06-14  in ElasticSearch

I'm trying to read a log file, but it doesn't work: when logstash.conf is configured to listen on port 5000 it works, but reading from a file does not. I'm running Logstash 7.9.1 in a Docker container and trying to send the logs to Elasticsearch 7.9.1. This is my logstash.conf file:

    input {
      file {
        path => ["/home/douglas/projects/incollect/*.log"]
        start_position => "beginning"
        ignore_older => 0
        sincedb_path => "/dev/null"
      }
    }
    output {
      elasticsearch {
        hosts => "elasticsearch:9200"
        index => "test-elk-%{+YYYY.MM.dd}"
        user => "elastic"
        password => "changeme"
      }
      stdout {
        codec => rubydebug
      }
    }
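One detail worth checking in the input above: in the Logstash file input, `ignore_older` is a time span in seconds, and `ignore_older => 0` tells the plugin to skip any file last modified more than 0 seconds ago, which in practice skips every file. A minimal sketch of the same input with that line dropped (path unchanged from the question):

```
input {
  file {
    path => ["/home/douglas/projects/incollect/*.log"]
    start_position => "beginning"   # read existing files from the top
    sincedb_path => "/dev/null"     # do not persist read offsets between restarts
    # ignore_older removed: with 0 it would skip files older than 0 seconds
  }
}
```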

This is the log from the console; I don't see any errors, and it says Logstash started successfully:

    logstash_1 | [2020-10-16T00:38:27,748][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
    logstash_1 | [2020-10-16T00:38:27,795][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
    logstash_1 | [2020-10-16T00:38:27,798][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x44d5fe run>"}
    logstash_1 | [2020-10-16T00:38:27,800][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x4c6dee32 run>"}
    logstash_1 | [2020-10-16T00:38:27,840][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
    logstash_1 | [2020-10-16T00:38:28,535][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.73}
    logstash_1 | [2020-10-16T00:38:28,599][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
    logstash_1 | [2020-10-16T00:38:28,600][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.8}
    logstash_1 | [2020-10-16T00:38:28,840][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
    logstash_1 | [2020-10-16T00:38:28,909][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}
    logstash_1 | [2020-10-16T00:38:28,920][INFO ][filewatch.observingtail ][main][4a3eb924128694e00dae8e6fab084bfc5e3c3692e66663362019b182fcb31a48] START, creating Discoverer, Watch with file and sincedb collections
    logstash_1 | [2020-10-16T00:38:29,386][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

This is my log file:

    Oct 9 15:34:19 incollect drupal: http://dev.incollect.com|1602257659|DEV|52.202.31.67|http://dev.incollect.com/icadmin/inquires_report?q=icadmin/ajax_validate_and_fix_inquire_by_id|http://dev.incollect.com/icadmin/inquires_report|3||Validate inquireStep 0
    Oct 9 15:34:19 incollect drupal: http://dev.incollect.com|1602257659|DEV|52.202.31.67|http://dev.incollect.com/icadmin/inquires_report?q=icadmin/ajax_validate_and_fix_inquire_by_id|http://dev.incollect.com/icadmin/inquires_report|3||Validate inquireStep 1 - inquire_id:14219

Edited: I'm adding the docker-compose file; this is my configuration for logstash:

    logstash:
      build:
        context: logstash/
        args:
          ELK_VERSION: $ELK_VERSION
      volumes:
        - type: bind
          source: ./logstash/config/logstash.yml
          target: /usr/share/logstash/config/logstash.yml
          read_only: true
        - type: bind
          source: ./logstash/pipeline
          target: /usr/share/logstash/pipeline
          read_only: true
      volumes:
        - ./../../:/usr/share/logstash
      ports:
        - "5000:5000/tcp"
        - "5000:5000/udp"
        - "9600:9600"
      environment:
        LS_JAVA_OPTS: "-Xmx256m -Xms256m"
      networks:
        - elk
      depends_on:
        - elasticsearch
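Note that the service above declares `volumes:` twice; a YAML mapping cannot hold duplicate keys, so depending on the parser the second list (the `./../../` bind mount) silently replaces the first, or the file is rejected. A hedged sketch with a single merged `volumes:` list, where the extra mount of the host log directory to `/var/log/incollect` is a hypothetical choice, not taken from the original compose file:

```yaml
logstash:
  build:
    context: logstash/
    args:
      ELK_VERSION: $ELK_VERSION
  volumes:   # one list: the duplicate 'volumes:' keys merged
    - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
    - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
    # hypothetical: mount only the log directory instead of the whole project tree
    - /home/douglas/projects/incollect:/var/log/incollect:ro
  ports:
    - "5000:5000/tcp"
    - "5000:5000/udp"
    - "9600:9600"
  environment:
    LS_JAVA_OPTS: "-Xmx256m -Xms256m"
  networks:
    - elk
  depends_on:
    - elasticsearch
```

With a mount like that, the file input `path` would point at `/var/log/incollect/*.log` inside the container.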

I don't know what the problem is; I've tried different solutions and none of them work.


dzjeubhm1#

If `- ./../../:/usr/share/logstash` is the volume you are using to mount your logs, then your Logstash file input path should point to `/usr/share/logstash/*.log` (the path inside the container), not the host path.
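Following that suggestion, the input block would look like this (a sketch assuming the `*.log` files land at the root of `/usr/share/logstash` inside the container; you can verify with `docker compose exec logstash ls /usr/share/logstash`):

```
input {
  file {
    path => ["/usr/share/logstash/*.log"]   # container path, not the host path
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```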
