Unable to register RDD as a temp table

s3fp2yjn · asked 2021-06-26 · tagged Hive

I'm using IntelliJ and trying to fetch data from a MySQL database and write it into a Hive table. However, I can't register my RDD as a temporary table; the error is "Cannot resolve symbol registerTempTable".
I suspect the problem is a missing import, but I can't work out which one.
I've been stuck on this for quite a while and have tried every option/answer I could find on Stack Overflow.
Here is my code:

import java.sql.DriverManager

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.JdbcRDD
import org.apache.spark.sql.hive.HiveContext

object JdbcRddExample {

  def main(args: Array[String]): Unit = {
    val url = "jdbc:mysql://localhost:3306/retail_db"
    val username = "retail_dba"
    val password = "cloudera"

    // SparkContext must exist before the SQL/Hive contexts that wrap it
    val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    Class.forName("com.mysql.jdbc.Driver").newInstance

    val myRDD = new JdbcRDD(sc, () => DriverManager.getConnection(url, username, password),
      "select department_id,department_name from departments limit ?,?",
      0, 999999999, 1, r => r.getString("department_id") + ", " + r.getString("department_name"))

    myRDD.registerTempTable("My_Table") // error: Not able to resolve registerTempTable

    sqlContext.sql("use my_db")
    sqlContext.sql("Create table my_db.depts (department_id INT, department_name String)")
  }
}

My build.sbt (I believe I've already added all the required artifacts):

name := "JdbcRddExample"

version := "0.1"

scalaVersion := "2.11.12"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.7.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % "2.11.0"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.0"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "mysql" % "mysql-connector-java" % "5.1.12"
)
// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"

Please point me to the exact import I'm missing, or suggest another way to do this. As I mentioned, I've already tried every solution I could find and nothing has worked so far.


uemypmqf1#

To work with Spark SQL you need a DataFrame rather than an RDD; an RDD simply does not have a registerTempTable method.
You can fix this quickly by converting the RDD into a DataFrame (see, for example, "How to convert rdd object to dataframe in spark"; a sketch of that route follows the JDBC example below). The recommended approach, though, is to use Spark SQL's built-in JDBC data source to read from MySQL directly, as in the example here. Sample code:

// Read the MySQL query result directly through Spark SQL's JDBC data source
val dfDepartments = sqlContext.read.format("jdbc")
  .option("url", url)
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "(select department_id,department_name from departments) t")
  .option("user", username)
  .option("password", password)
  .load()

// Register the DataFrame so it can be queried with sqlContext.sql(...)
dfDepartments.createOrReplaceTempView("My_Table")
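
If you would rather keep the JdbcRDD from the question, here is a minimal sketch of the conversion route. It assumes Spark 2.x with a SparkSession (the replacement for SQLContext/HiveContext); the case class `Department` and the object name are illustrative, not part of the original post, and the final `saveAsTable` call is just one way to land the result in the Hive table the question is ultimately after.

import java.sql.DriverManager

import org.apache.spark.rdd.JdbcRDD
import org.apache.spark.sql.SparkSession

// Illustrative case class so toDF can derive a schema
case class Department(departmentId: Int, departmentName: String)

object JdbcRddToDataFrame {
  def main(args: Array[String]): Unit = {
    val url = "jdbc:mysql://localhost:3306/retail_db"
    val username = "retail_dba"
    val password = "cloudera"

    // Spark 2.x entry point; enableHiveSupport() replaces the old HiveContext
    val spark = SparkSession.builder()
      .appName("JDBC RDD to DataFrame")
      .master("local[2]")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    // Same JdbcRDD as in the question, but each row is mapped to a case class
    val deptRDD = new JdbcRDD(
      spark.sparkContext,
      () => {
        Class.forName("com.mysql.jdbc.Driver")
        DriverManager.getConnection(url, username, password)
      },
      "select department_id, department_name from departments limit ?,?",
      0, 999999999, 1,
      r => Department(r.getInt("department_id"), r.getString("department_name")))

    // RDD[Department] -> DataFrame; now temp views and Spark SQL work
    val deptDF = deptRDD.toDF()
    deptDF.createOrReplaceTempView("My_Table")

    // With Hive support enabled, the DataFrame can be written straight into a Hive table
    deptDF.write.mode("overwrite").saveAsTable("my_db.depts")
  }
}

Either way the key point is the same: once the rows are in a DataFrame, createOrReplaceTempView (the Spark 2.x replacement for registerTempTable) becomes available.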
