I am trying to build a system that collects data from agents and pushes it to a Kafka server (via Logstash). After configuring the Kafka server, I tested the consumer and producer (Kafka) locally and everything worked fine. But when I try to push data from the Logstash agent (on a remote server) to Kafka and monitor it with the consumer, nothing happens: no data is pushed. Can you give me some hints? My configuration is below:
Logstash config:
output {
  stdout { codec => rubydebug }
  kafka {
    bootstrap_servers => "public_IP_remote_server"
    topic_id => "iis"
  }
}
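Before digging into the Kafka settings themselves, one thing worth ruling out is basic network reachability from the Logstash host to the broker. A minimal sketch of such a check (the host name below is a placeholder for the broker's actual public IP, matching the configs that follow):

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholders: substitute the broker's actual public IP before running.
# port_reachable("public_IP_remote_server", 9092)   # Kafka broker port
# port_reachable("public_IP_remote_server", 2181)   # ZooKeeper port
```

If this returns False from the Logstash machine, the problem is a firewall or routing issue rather than the Kafka or Logstash configuration.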
consumer.properties (on the Kafka server):
zookeeper.connect=*public_IP_remote_server*:2181
zookeeper.connection.timeout.ms=6000
group.id=test-consumer-group
producer.properties (on the Kafka server):
bootstrap.servers=*public_IP_remote_server*:9092
compression.type=none
server.properties (on the Kafka server):
broker.id=0
advertised.host.name=*public_IP_remote_server*
advertised.port=9092
listeners=PLAINTEXT://*public_IP_remote_server*:9092
delete.topic.enable=true
advertised.listeners=PLAINTEXT://*public_IP_remote_server*:9092
num.network.threads=3
num.io.threads=8
socket.send.buffer.bytes=102400
socket.receive.buffer.bytes=102400
socket.request.max.bytes=104857600
log.dirs=/tmp/kafka-logs
num.partitions=1
log.retention.hours=168
log.segment.bytes=1073741824
log.retention.check.interval.ms=300000
zookeeper.connection.timeout.ms=6000
zookeeper.connect=localhost:2181
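For reference, in server.properties the `listeners` entry controls which interface the broker binds to, while `advertised.listeners` is the address the broker hands back to clients for subsequent connections. A commonly seen layout (purely an illustration of the two roles, not necessarily the fix for this setup) is:

```
# Bind on all interfaces (illustrative value)
listeners=PLAINTEXT://0.0.0.0:9092
# Advertise the address remote clients should use to reach the broker
advertised.listeners=PLAINTEXT://public_IP_remote_server:9092
```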
zookeeper.properties (on the Kafka server):
dataDir=/tmp/zookeeper
clientPort=2181
maxClientCnxns=0