How do I get my first Storm Kafka demo to work?

6ie5vjzr · posted 2022-12-09 in Apache

Getting a simple first Storm topology submitted is driving me crazy. First, when I tried to run it against a remote VM with Storm installed, the connection was refused. I understand that I have to run it on a local cluster instead.
Here is my code:

public void execTopology_alt() throws Exception {

    final TopologyBuilder tp = new TopologyBuilder();
    // Kafka spout reading `topic` from `this.bootstrapServers`
    tp.setSpout("kafka_spout", new KafkaSpout<>(KafkaSpoutConfig.builder(this.bootstrapServers, topic).build()), 1);
    // LoggerBolt consumes the spout's output
    tp.setBolt("bolt", new LoggerBolt()).shuffleGrouping("kafka_spout");

    Config conf = new Config();

    // submit to an in-process LocalCluster
    LocalCluster cluster = new LocalCluster();
    cluster.submitTopology("kafkaboltTest", conf, tp.createTopology());
}

There seem to be two problems, and they have kept me busy for hours.
1) Class not found

20:48:40.202 [main] INFO org.apache.storm.daemon.metrics.ClientMetricsUtils - Using statistics reporter plugin:org.apache.storm.daemon.metrics.reporters.JmxPreparableReporter
20:48:40.203 [main] INFO org.apache.storm.daemon.metrics.reporters.JmxPreparableReporter - Preparing...
Exception in thread "main" java.lang.NoClassDefFoundError: com/codahale/metrics/JmxReporter
    at org.apache.storm.daemon.metrics.reporters.JmxPreparableReporter.prepare(JmxPreparableReporter.java:32)
    at org.apache.storm.metric.StormMetricsRegistry.startMetricsReporters(StormMetricsRegistry.java:74)
    at org.apache.storm.LocalCluster.<init>(LocalCluster.java:287)
    at org.apache.storm.LocalCluster.<init>(LocalCluster.java:159)
    at tki.bigdata.storm.StormTopology.execTopology_alt(StormTopology.java:93)
    at tki.bigdata.storm.StormTopology.main(StormTopology.java:46)
Caused by: java.lang.ClassNotFoundException: com.codahale.metrics.JmxReporter
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 6 more

I'm posting my pom.xml below. I'm also using Spring Boot and Spring Kafka (the Kafka part works). Maybe that doesn't play well with the Storm pieces?

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.1.5.RELEASE</version>
        <relativePath /> <!-- lookup parent from repository -->
    </parent>
    <groupId>tki.bigdata</groupId>
    <artifactId>RealTime</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>RealTime</name>
    <description>Demo project for Spring Boot</description>

    <properties>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>



        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.storm/storm-server -->
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-server</artifactId>
            <version>2.0.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.storm/storm-core -->
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-core</artifactId>
            <version>2.0.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.storm/storm-kafka -->
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-kafka</artifactId>
            <version>1.2.2</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.storm/storm-kafka-client -->
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-kafka-client</artifactId>
            <version>2.0.0</version>
        </dependency>

    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>


</project>

2) When I run it, there is a huge amount of output that repeats forever:

20:48:40.968 [main-SendThread(localhost:2004)] DEBUG org.apache.storm.shade.org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1000fc645fb0005, packet:: clientPath:null serverPath:null finished:false header:: 20,4  replyHeader:: 20,39,0  request:: '/storm/supervisors/6de1c57f-06e9-49ae-984f-09b5ad7d05e5,F  response:: #1fffffff8b80000000354e41effffff8230105c1022ffffffd018ffffffe2ffffffc1ffffff8377ffffffaf6dffffffaaffffffa160f267cffffffc432d2531ffffffa6ffffffb66930115fffffffe0737cffffff94fffffffb1ffffff8d38ffffff87ffffff9dffffffddffffffd9ffffff9dffffffdd2d202ffffff8f737a7a21ffffff887dffffffb6ffffffd1ffffff9d14ffffffc65cffffffb175ffffff86cffffffee323effffff8930f4bffffffdfffffffdbffffffd5ffffffbdffffffda4bffffffd6cffffff98ffffffd6ffffff8affffffe3ffffff8a77afffffff363ffffffe54b2e58ffffffd7373d65ffffff8affffff95ffffff90146165969ffffffe0ffffffe0fffffff34860ffffffe668ffffffe6ffffffb8ffffff80ffffffe5ffffffac1ffffff82ffffffccffffffc7fffffff44028ffffffa12bffffffc85118effffffbfffffffacffffffa5ffffffbd132bffffff8d53ffffffc42a27ffffffd56d6cffffffa7ffffffbf772ffffffadffffffb47113ffffffd1ffffffa27dffffffffffffff94cfffffffcffffffc1ffffffed7ffffffbe7fffffffda34ffffffc2ffffffd4000,s{35,35,1560624520123,1560624520123,0,0,0,72074938289946636,186,0,35} 
20:48:40.968 [SyncThread:0] DEBUG org.apache.storm.shade.org.apache.zookeeper.server.FinalRequestProcessor - Processing request:: sessionid:0x1000fc645fb0005 type:exists cxid:0x15 zxid:0xfffffffffffffffe txntype:unknown reqpath:/storm/supervisors/5488b159-db82-42d0-b24e-354c59ddce34
20:48:40.969 [SyncThread:0] DEBUG org.apache.storm.shade.org.apache.zookeeper.server.FinalRequestProcessor - sessionid:0x1000fc645fb0005 type:exists cxid:0x15 zxid:0xfffffffffffffffe txntype:unknown reqpath:/storm/supervisors/5488b159-db82-42d0-b24e-354c59ddce34
20:48:40.969 [main-SendThread(localhost:2004)] DEBUG org.apache.storm.shade.org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1000fc645fb0005, packet:: clientPath:null serverPath:null finished:false header:: 21,3  replyHeader:: 21,39,0  request:: '/storm/supervisors/5488b159-db82-42d0-b24e-354c59ddce34,F  response:: s{39,39,1560624520167,1560624520167,0,0,0,72074938289946638,185,0,39} 
20:48:40.969 [SyncThread:0] DEBUG org.apache.storm.shade.org.apache.zookeeper.server.FinalRequestProcessor - Processing request:: sessionid:0x1000fc645fb0005 type:getData cxid:0x16 zxid:0xfffffffffffffffe txntype:unknown reqpath:/storm/supervisors/5488b159-db82-42d0-b24e-354c59ddce34
20:48:40.969 [SyncThread:0] DEBUG org.apache.storm.shade.org.apache.zookeeper.server.FinalRequestProcessor - sessionid:0x1000fc645fb0005 type:getData cxid:0x16 zxid:0xfffffffffffffffe txntype:unknown reqpath:/storm/supervisors/5488b159-db82-42d0-b24e-354c59ddce34

ylamdve6 1#

The problem is in your POM and in your Kafka setup.
Remove storm-kafka, you don't need it. It is the Storm integration for the old Kafka API, and storm-kafka-client replaces it.
I would avoid using Spring together with Storm. Storm knows nothing about Spring beans, so your Spring setup will probably only work on a local cluster, not in a production setup.
You need to put storm-client on the classpath; set it to provided scope. You can also remove storm-core.
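A rough sketch of how the Storm part of the dependencies section could then look (versions copied from the POM above; note that for running LocalCluster straight from the IDE you may need to keep storm-client at the default compile scope instead of provided, since provided dependencies are not on the runtime classpath):

        <!-- Sketch of the Storm dependencies after the suggested cleanup:
             storm-kafka and storm-core removed, storm-client added;
             storm-server stays because LocalCluster is used for local testing. -->
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-client</artifactId>
            <version>2.0.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-server</artifactId>
            <version>2.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-kafka-client</artifactId>
            <version>2.0.0</version>
        </dependency>
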
Most likely you also need to set a consumer group in your spout's Kafka configuration.
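A minimal, self-contained sketch of what that could look like with storm-kafka-client 2.0.0 — the group id storm-demo-group, the broker address and the topic name are made-up placeholders, and PrintBolt stands in for the LoggerBolt from the question:

import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Tuple;

public class KafkaSpoutGroupDemo {

    // Minimal stand-in for the LoggerBolt from the question: prints every tuple.
    public static class PrintBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            System.out.println(tuple);
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            // nothing emitted downstream
        }
    }

    public static void main(String[] args) throws Exception {
        String bootstrapServers = "localhost:9092"; // placeholder, use your brokers
        String topic = "test-topic";                // placeholder, use your topic

        // Explicit consumer group for the spout ("storm-demo-group" is a made-up name).
        KafkaSpoutConfig<String, String> spoutConfig =
                KafkaSpoutConfig.builder(bootstrapServers, topic)
                        .setProp("group.id", "storm-demo-group")
                        .build();

        TopologyBuilder tp = new TopologyBuilder();
        tp.setSpout("kafka_spout", new KafkaSpout<>(spoutConfig), 1);
        tp.setBolt("bolt", new PrintBolt()).shuffleGrouping("kafka_spout");

        // LocalCluster is AutoCloseable in Storm 2.x; run for a minute, then shut down.
        try (LocalCluster cluster = new LocalCluster()) {
            cluster.submitTopology("kafkaboltTest", new Config(), tp.createTopology());
            Thread.sleep(60_000);
        }
    }
}
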
https://github.com/apache/storm/blob/v2.0.0/examples/storm-kafka-client-examples contains a complete example; I would use that as a starting point.
