How to use Phoenix 5.0 with the Spark 3.0 preview

fykwrbwg · posted 2023-05-29 in Apache
case "phoenix" =>{
        outputStream.foreachRDD(rdd=>{
          val spark=SparkSession.builder().config(rdd.sparkContext.getConf).getOrCreate()
          val ds=spark.createDataFrame(rdd,Class.forName(settings.BEAN_CLASS))
          ds.write.
            format("org.apache.phoenix.spark").
            mode(SaveMode.Overwrite).
            options(Map(
              "table" -> settings.OUTPUT_TABLENAME,
              "zkUrl" -> settings.ZK_URL,
              "zookeeper.znode.parent" -> settings.ZNODE_PARENT,
              "hbase.rootdir" -> settings.ROOT_DIR,
              "hbase.client.keyvalue.maxsize" -> "0")).
            save()
        })
      }

This works with Spark 2.2 and phoenix-4.12.0-HBase-1.2, but it fails with the Spark 3.0 preview and phoenix-5.0.0-HBase-2.0. How can I fix it?
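From what I can tell, the phoenix-spark integration was moved out of Phoenix core into the separate phoenix-connectors project, and Phoenix 5.x targets Spark 3 through the phoenix5-spark connector, which registers the short format name "phoenix" instead of "org.apache.phoenix.spark". Below is a minimal sketch of what I believe the write would look like against that connector; the table name and ZooKeeper quorum are placeholders, "zkUrl" is the option key that connector documents, and cluster-level HBase settings such as zookeeper.znode.parent would normally come from an hbase-site.xml on the classpath:

import org.apache.spark.sql.{SaveMode, SparkSession}

// Sketch only: assumes the phoenix5-spark connector jar (phoenix-connectors project)
// is on the classpath and hbase-site.xml carries the cluster-specific settings.
object Phoenix5WriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("phoenix5-write-sketch")
      .getOrCreate()

    // Placeholder data; in the streaming job this would be createDataFrame(rdd, beanClass)
    val df = spark.createDataFrame(Seq((1, "one"), (2, "two"))).toDF("ID", "NAME")

    df.write
      .format("phoenix")               // short name registered by the new connector
      .mode(SaveMode.Append)           // the DataSource V2 write is a Phoenix UPSERT;
                                       // under Spark 3, Append is the mode that maps to it
      .option("table", "OUTPUT_TABLE") // placeholder target table
      .option("zkUrl", "zk-host:2181") // placeholder ZooKeeper quorum
      .save()

    spark.stop()
  }
}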
