Troubleshooting a Kafka + Flink example with sbt?

p8h8hvxi · posted 2021-06-08 in Kafka

New to the Kafka/Flink/Scala/sbt combination and trying to set up the following:
a multi-topic Kafka queue
a Flink streaming job using a Scala jar
a Scala jar that reads from one topic, processes the data, and pushes it to another topic
What works so far:
Able to set up Kafka and Flink correctly.
Able to read a Kafka queue using the kafka.jar example shipped with the Flink binary.
Able to create a wordcount jar (thanks ipoteka).
Now trying to create a streaming wordcount jar, but running into sbt issues.
Before attempting the actual Kafka/Spark streaming example, I'm first trying to build a sample wordcount.jar.
But I've messed something up in sbt and can't figure out what I'm overlooking.
Please also point out any unnecessary declarations I may have.
Would also appreciate it if someone could share a simple program that reads from / writes to a Kafka queue.
Project setup -

|- project/plugins.sbt
|- build.sbt
|- src/main/scala/WordCount.scala

build.sbt

name := "Kakfa-Flink Project"

version := "1.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

// Updated : Correction pointed by ipoteka 
libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.10.0.0"

libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.0.0"

libraryDependencies += "org.apache.flink" %% "flink-clients" % "1.0.0"

libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.0.0"

// for jar building
mainClass in compile := Some("StreamWordCount")
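
Two things stand out in this build file, since the question invites it: the entry point actually defined below is the object prog.Main, not StreamWordCount, so this setting cannot match, and spark-core is unused in a Flink job and can be dropped. Assuming the sources shown further down, the setting would need to be:

mainClass in (Compile, packageBin) := Some("prog.Main")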

project/plugins.sbt

// creating a fat jar
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")
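
Note that with sbt-assembly added, the fat jar is produced by running sbt assembly rather than sbt package; plain package compiles the project but does not bundle dependencies such as the Kafka connector into the jar.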

WordCount.scala

package prog

import org.apache.flink.api.scala._
import org.apache.flink.streaming.api.scala.DataStream
import org.apache.flink.streaming.api.windowing.time.Time

object WordCount {

  type WordCount = (String, Int)

  def main(lines: DataStream[String], stopWords: Set[String], window: Time): DataStream[WordCount] = {
    lines
      .flatMap(line => line.split(" "))
      .filter(word => !word.isEmpty)
      .map(word => word.toLowerCase)
      .filter(word => !stopWords.contains(word))
      .map(word => (word, 1))
      .keyBy(0)
      .timeWindow(window)
      .sum(1)
  }

}
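
One structural point about this file: main here takes a DataStream rather than Array[String], so the JVM will never treat it as an entry point, and the mainClass setting in build.sbt cannot refer to it. A minimal sketch (object and method names are my own) with the same pipeline moved into an ordinary helper:

package prog

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

object WordCountOps {

  type WordCount = (String, Int)

  // same pipeline as above, renamed so it is not mistaken for a JVM entry point
  def countWords(lines: DataStream[String],
                 stopWords: Set[String],
                 window: Time): DataStream[WordCount] =
    lines
      .flatMap(line => line.split(" "))
      .filter(word => !word.isEmpty)
      .map(word => word.toLowerCase)
      .filter(word => !stopWords.contains(word))
      .map(word => (word, 1))
      .keyBy(0)
      .timeWindow(window)
      .sum(1)
}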

StreamWordCount.scala

package prog

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

import org.apache.flink.api.scala._
import org.apache.flink.streaming.api.scala.DataStream
import org.apache.flink.streaming.api.windowing.time.Time

object Main {
  def main(args: Array[String]) {

  type WordCount = (String, Int)

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("zookeeper.connect", "localhost:2181")
    properties.setProperty("group.id", "test")
    val stream = env
      .addSource(new FlinkKafkaConsumer082[String]("topic", new SimpleStringSchema(), properties))
      .flatMap(line => line.split(" "))
      .filter(word => !word.isEmpty)
      .map(word => word.toLowerCase)
      .filter(word => !stopWords.contains(word))
      .map(word => (word, 1))
      .keyBy(0)
      .timeWindow(window)
      .sum(1)
      .print

    env.execute("Flink Kafka Example")
  }
}
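
Apart from the errors reported below, stopWords and window are never defined in this object, so they would surface as further "not found" errors once the import problems are fixed; see the corrected sketch after the answer.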

Error when building the jar (updated):

[vagrant@streaming ex]$ /opt/sbt/bin/sbt  package
[error] /home/vagrant/ex/src/main/scala/StreamWordCount.scala:4: object connectors is not a member of package org.apache.flink.streaming
[error] import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
[error]                                   ^
[error] /home/vagrant/ex/src/main/scala/StreamWordCount.scala:18: not found: type Properties
[error]     val properties = new Properties()
[error]                          ^
[error] /home/vagrant/ex/src/main/scala/StreamWordCount.scala:23: not found: type FlinkKafkaConsumer082
[error]       .addSource(new FlinkKafkaConsumer082[String]("topic", new SimpleStringSchema(), properties))
[error]                      ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 31 s, completed Jul 3, 2016 9:02:18 PM

Answer 1 (tct7dpnv):

Where did you get these versions from? I don't see any Kafka release 1.0.0. Check Maven (the sbt tab):

libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.10.0.0"

I'd also suggest checking all the other versions; the current Spark release is 1.6.2, for example.
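
Beyond the version mismatch, each of the three compile errors has a concrete cause: the Kafka connector lives in a separate artifact (flink-connector-kafka-0.8, whose version tracks Flink, not Kafka, and which pulls in its own Kafka client), Properties needs an explicit java.util import, and FlinkKafkaConsumer082 was the pre-1.0 class name, renamed FlinkKafkaConsumer08 in Flink 1.0. A minimal sketch of both files against Flink 1.0.0; topic names, stop words, and the window size are placeholders:

build.sbt

name := "Kafka-Flink Project"

version := "1.0"

// matches the _2.10 artifacts referenced in the question
scalaVersion := "2.10.6"

// spark-core and the explicit kafka dependency are not needed for a Flink job;
// the connector artifact brings in the Kafka client it is built against
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"               % "1.0.0",
  "org.apache.flink" %% "flink-streaming-scala"     % "1.0.0",
  "org.apache.flink" %% "flink-clients"             % "1.0.0",
  "org.apache.flink" %% "flink-connector-kafka-0.8" % "1.0.0"
)

mainClass in (Compile, packageBin) := Some("prog.Main")

StreamWordCount.scala

package prog

import java.util.Properties  // Properties lives in java.util; this import was missing

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer08, FlinkKafkaProducer08}
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object Main {
  def main(args: Array[String]): Unit = {
    // referenced but never defined in the original; placeholder values
    val stopWords = Set("a", "an", "the")
    val window    = Time.seconds(10)

    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("zookeeper.connect", "localhost:2181")  // required by the 0.8 consumer
    properties.setProperty("group.id", "test")

    val counts = env
      // FlinkKafkaConsumer082 was renamed FlinkKafkaConsumer08 in Flink 1.0.x
      .addSource(new FlinkKafkaConsumer08[String]("input-topic", new SimpleStringSchema(), properties))
      .flatMap(line => line.split(" "))
      .filter(word => !word.isEmpty)
      .map(word => word.toLowerCase)
      .filter(word => !stopWords.contains(word))
      .map(word => (word, 1))
      .keyBy(0)
      .timeWindow(window)
      .sum(1)

    counts.print()

    // push the aggregated counts to a second topic, per the read-process-write
    // goal in the question; the output topic name is a placeholder
    counts
      .map(wc => wc._1 + "," + wc._2)
      .addSink(new FlinkKafkaProducer08[String]("localhost:9092", "output-topic", new SimpleStringSchema()))

    env.execute("Flink Kafka Example")
  }
}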
