Error: negative length -62

ht4b089n · posted 2021-06-04 in Kafka

I am facing a problem similar to the one described here: https://discuss.elastic.co/t/argumenterror-when-using-kafka-input-avro-codec/116975
Logstash configuration:

    input {
      kafka {
        group_id => "group_1"
        topics => ["topic_1"]
        bootstrap_servers => "192.168.0.1:9092"
        codec => avro {
          schema_uri => "/files/GA6/logstash-6.0.0/CONFIG_HOME/myschema.avsc"
        }
      }
    }
    output {
      stdout {
      }
    }

Error log:

    [2018-01-25T11:54:37,060][FATAL][logstash.runner ] An unexpected error occurred!
    {:error=>#<ArgumentError: negative length -15 given>, :backtrace=>[
    "org/jruby/ext/stringio/StringIO.java:788:in `read'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:106:in `read'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:93:in `read_bytes'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:99:in `read_string'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:299:in `read_data'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:384:in `block in read_record'",
    "org/jruby/RubyArray.java:1734:in `each'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:382:in `read_record'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:310:in `read_data'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:275:in `read'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-codec-avro-3.2.3-java/lib/logstash/codecs/avro.rb:77:in `decode'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.0.2/lib/logstash/inputs/kafka.rb:254:in `block in thread_runner'",
    "/files/GA6/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.0.2/lib/logstash/inputs/kafka.rb:253:in `block in thread_runner'"
    ]}
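
For context on where a negative length comes from: the read_string / read_bytes frames in the backtrace first read the length as a zigzag-encoded varint, so if the decoder starts on bytes that are not a plain Avro datum (for example a serializer-specific header), that length can come out negative. A minimal sketch of the decoding in Python; the input byte 0x1D is chosen only to illustrate how a value like the -15 above can appear, it is not taken from my data:

    # Zigzag varint decoding, as Avro uses for string/bytes lengths.
    def read_zigzag_varint(data: bytes) -> int:
        n, shift = 0, 0
        for b in data:
            n |= (b & 0x7F) << shift   # accumulate 7 payload bits per byte
            if not b & 0x80:           # high bit clear: last byte of varint
                break
            shift += 7
        return (n >> 1) ^ -(n & 1)     # undo the zigzag mapping

    print(read_zigzag_varint(b"\x1d"))  # -> -15, as in the error above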

Sample schema:

    {
      "type": "record",
      "name": "Sample",
      "doc": "Sample Schema",
      "fields": [{
        "name": "name",
        "type": "string"
      }, {
        "name": "address",
        "type": "string"
      }, {
        "name": "salary",
        "type": "long"
      }]
    }
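
For reference, this schema round-trips as a raw Avro datum. A minimal sketch with the Python avro package (assuming the 1.10+ API; the record values are made up). This is the framing, with no extra leading bytes, that a plain Avro binary decoder expects:

    import io
    import avro.schema
    from avro.io import DatumWriter, DatumReader, BinaryEncoder, BinaryDecoder

    SCHEMA = avro.schema.parse("""
    {
      "type": "record",
      "name": "Sample",
      "doc": "Sample Schema",
      "fields": [
        {"name": "name", "type": "string"},
        {"name": "address", "type": "string"},
        {"name": "salary", "type": "long"}
      ]
    }
    """)

    # Encode one record as a raw Avro datum -- no header bytes of any kind.
    buf = io.BytesIO()
    DatumWriter(SCHEMA).write(
        {"name": "a", "address": "b", "salary": 1}, BinaryEncoder(buf)
    )
    payload = buf.getvalue()

    # Decoding it the same way succeeds; any extra leading bytes would
    # instead be misread as a (possibly negative) length.
    record = DatumReader(SCHEMA).read(BinaryDecoder(io.BytesIO(payload)))
    print(record)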

Following some of the discussions, I also added:

    key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"

But the problem still persists...
Please let me know if you need any further information.
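
A common culprit in similar reports is a producer that writes the Confluent wire format (a 0x00 magic byte plus a 4-byte schema registry ID) in front of each Avro datum, which a plain Avro decoder then misreads as datum content. A sketch of inspecting the first bytes of a message with the kafka-python package (assumed to be installed; topic and broker taken from the config above):

    import struct
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "topic_1",
        bootstrap_servers="192.168.0.1:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,  # stop polling after 5s of silence
    )
    for msg in consumer:
        raw = msg.value
        if raw[:1] == b"\x00" and len(raw) > 5:
            schema_id = struct.unpack(">I", raw[1:5])[0]
            print("looks like Confluent wire format, schema id:", schema_id)
        else:
            print("first bytes:", raw[:8].hex())
        break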
