case "phoenix" => {
  outputStream.foreachRDD { rdd =>
    // Reuse (or lazily create) a SparkSession bound to the streaming context's conf.
    val spark = SparkSession.builder().config(rdd.sparkContext.getConf).getOrCreate()
    // Convert the RDD of beans into a DataFrame using the configured bean class.
    val ds = spark.createDataFrame(rdd, Class.forName(settings.BEAN_CLASS))
    ds.write
      .format("org.apache.phoenix.spark")
      .mode(SaveMode.Overwrite)
      .options(Map(
        "table" -> settings.OUTPUT_TABLENAME,
        "zkUrl" -> settings.ZK_URL,
        "zookeeper.znode.parent" -> settings.ZNODE_PARENT,
        "hbase.rootdir" -> settings.ROOT_DIR,
        "hbase.client.keyvalue.maxsize" -> "0"))
      .save()
  }
}
This works with Spark 2.2 and phoenix-4.12.0-HBase-1.2, but fails with the Spark 3.0 preview and phoenix-5.0.0-HBase-2.0. How can I fix it?
1 answer
The old phoenix-spark module does not support Spark 3. Use the phoenix5-spark3 connector instead, which targets Phoenix 5 and Spark 3; at the time of the answer it was only available as a snapshot build.
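A minimal sketch of what the migrated write path might look like with the phoenix5-spark3 connector. The dependency coordinates, the short format name "phoenix", and the option keys below are assumptions taken from the connector's README; verify them against the exact connector version you pull in, and check which SaveMode that version supports before relying on this:

```scala
// Hypothetical build dependency — substitute the snapshot/release version you
// actually use from the Apache Phoenix connectors repository:
// libraryDependencies += "org.apache.phoenix" % "phoenix5-spark3" % "<version>"

import org.apache.spark.sql.{DataFrame, SaveMode}

// Sketch: write a DataFrame to a Phoenix table via the phoenix5-spark3
// connector. "ds" and "settings" are the same names used in the question.
def writeToPhoenix(ds: DataFrame): Unit = {
  ds.write
    .format("phoenix")                        // short name in the new connector
    .mode(SaveMode.Overwrite)                 // confirm supported modes for your version
    .option("table", settings.OUTPUT_TABLENAME)
    .option("zkUrl", settings.ZK_URL)         // or configure via hbase-site.xml on the classpath
    .save()
}
```

The HBase-level options (znode parent, root dir, keyvalue max size) are typically better supplied through an hbase-site.xml on the executor classpath than as per-write options, but that is a deployment choice rather than something the connector requires.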