Apache Kafka not working through the API

pn9klfpd, posted 2021-06-06 in Kafka

The console commands kafka-console-producer.sh and kafka-console-consumer.sh run fine, but when I try to produce or consume through the API, it doesn't work. Can someone tell me what is wrong with my Scala code?

import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ScalaProducerExample  {
  val topic = "test"
  val brokers = "<broker>:9092"
  val props = new Properties()
  props.put("bootstrap.servers", brokers)
  props.put("client.id", "ScalaProducerExample")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  val producer = new KafkaProducer[String, String](props)
  val data = new ProducerRecord[String, String](topic, "message")
  producer.send(data)
  producer.close()
}

These are the dependencies loaded in the build.sbt file:

libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.8.2.1"

libraryDependencies += "org.apache.kafka" %% "kafka" % "0.10.2.0"

I even wrote it in Java, and the same thing happens.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class ProducerExample {
    public static void main(String[] args) {
        String topic = "test";
        String brokers = "<broker>:9092";
        System.out.println("init " );
        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);

        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        System.out.println("creating prducer " );
        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
        producer.flush();
        producer.send(new ProducerRecord<>(topic, "1", "2"));
        producer.close();
        System.out.println("close  " );
    }
}

The dependency in build.sbt is:

libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.8.2.1"

I know the connection works, because when I change the broker I get an error. But when the broker is correct, the program runs successfully, yet I never receive any message.
Update: I assume the program only appears to run successfully because the send times out. I checked with

try {
    producer.send(new ProducerRecord<>(topic, "1", "2")).get(30, TimeUnit.SECONDS);
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (ExecutionException e) {
    e.printStackTrace();
} catch (TimeoutException e) {
    e.printStackTrace();
}

and got this error:

java.util.concurrent.TimeoutException: Timeout after waiting for 30000 ms.
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:64)
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:25)
        at de.innocow.kafka.ProducerExample.main(ProducerExample.java:45)

How can I debug this further and find out why the producer is not sending?
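
For reference (this sketch is not part of the original post and assumes the same topic, broker, and producer configuration as above), one way to surface the underlying error without blocking on the returned Future is to pass a Callback to send(); it is invoked with the exception once the request completes or fails:

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.RecordMetadata;

// inside main(), after the producer has been created as above
producer.send(new ProducerRecord<String, String>(topic, "1", "2"), new Callback() {
    public void onCompletion(RecordMetadata metadata, Exception exception) {
        if (exception != null) {
            // the request failed or timed out; the exception explains why
            exception.printStackTrace();
        } else {
            System.out.println("sent: partition=" + metadata.partition()
                    + " offset=" + metadata.offset());
        }
    }
});
producer.flush();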


rjzwgtxy #1

producer.send(new ProducerRecord<>(topic, "1", "2"));
producer.flush();            
producer.close();

Try this, and have a look at the documentation:

The flush() call gives a convenient way to ensure all previously sent messages have actually completed.
This example shows how to consume from one Kafka topic and produce to another Kafka topic:

for (ConsumerRecord<String, String> record : consumer.poll(100))
    producer.send(new ProducerRecord("my-topic", record.key(), record.value()));
producer.flush();
consumer.commit();
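
Applying the suggested ordering (send first, then flush, then close) to the original example, a minimal version of the Java producer would look roughly like this (a sketch; the broker address and topic are the placeholders from the question):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<broker>:9092"); // placeholder from the question
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);

        // send first, then flush: flush() blocks until every record sent so far has completed
        producer.send(new ProducerRecord<String, String>("test", "1", "2"));
        producer.flush();

        // close() also waits for outstanding requests before releasing resources
        producer.close();
    }
}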
