I intermittently hit this issue when writing to a Hive table with Spark Structured Streaming's file sink:
Caused by: java.lang.IllegalStateException: hdfs://node-master:8020/user/hive/warehouse/hive_table_name/_spark_metadata/0 doesn't exist when compacting batch 9 (compactInterval: 10)
I am using:
spark.sql.orc.impl = native
hive.exec.dynamic.partition = true
hive.exec.dynamic.partition.mode = nonstrict
spark version = 2.3.1
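For context, a minimal sketch of how these settings could be applied when building the session (the app name is hypothetical; the config values mirror the list above):

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: applies the settings listed above at session construction.
val spark = SparkSession.builder()
  .appName("hive-orc-stream") // hypothetical app name
  .config("spark.sql.orc.impl", "native")
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .enableHiveSupport()
  .getOrCreate()
```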
import org.apache.spark.sql.Row
import org.apache.spark.sql.streaming.DataStreamWriter

val hiveOrcWriter: DataStreamWriter[Row] = event_stream
  .writeStream
  .outputMode("append")
  .format("orc")
  .partitionBy("year", "month", "day")
  //.option("compression", "zlib")
  .option("path", _table_loc)                      // output table location
  .option("checkpointLocation", _table_checkpoint) // streaming checkpoint dir

hiveOrcWriter.start().awaitTermination()