Hive MERGE command not working in Spark HiveContext

m1m5dgzv · asked on 2021-05-29 · in Hadoop

I am running a Hive MERGE command through Spark's HiveContext on Spark 1.6.3, but it fails with the following error:

2017-09-11 18:30:33 Driver [INFO ] ParseDriver - Parse Completed
2017-09-11 18:30:34 Driver [INFO ] ParseDriver - Parsing command: MERGE INTO emp_with_orc AS T USING SOURCE_TABLE AS S 
ON T.id = S.id 
WHEN MATCHED AND (S.operation = 1) THEN UPDATE SET a = S.a,b = S.b 
WHEN MATCHED AND (S.operation = 2) THEN DELETE 
WHEN NOT MATCHED THEN INSERT VALUES (S.id, S.a, S.b)
2017-09-11 18:30:34 Driver [ERROR] HiveWriter - Error while executing the merge query.
org.apache.spark.sql.AnalysisException: cannot recognize input near 'MERGE' 'INTO' 'emp_with_orc'; line 1 pos 0
    at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:318)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)

I am not sure whether the ACID-transaction MERGE command is supported in Spark's HiveContext at all.
Any help would be appreciated.

bbuxkriu 1#

To run a MERGE operation you will need to execute it through Hive JDBC, because as of now Spark SQL does not support MERGE.
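A minimal sketch of submitting the same statement over HiveServer2 JDBC instead of HiveContext. The class name, the JDBC URL, and the assumption that HiveServer2 runs on `localhost:10000` are placeholders; only the SQL text is taken from the question:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveMergeViaJdbc {
    // The MERGE statement from the question, unchanged.
    static final String MERGE_SQL =
        "MERGE INTO emp_with_orc AS T USING SOURCE_TABLE AS S "
      + "ON T.id = S.id "
      + "WHEN MATCHED AND (S.operation = 1) THEN UPDATE SET a = S.a, b = S.b "
      + "WHEN MATCHED AND (S.operation = 2) THEN DELETE "
      + "WHEN NOT MATCHED THEN INSERT VALUES (S.id, S.a, S.b)";

    // Sends the statement to HiveServer2; the URL below is a placeholder --
    // point it at your own HiveServer2 instance and database.
    static void runMerge(String jdbcUrl) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             Statement stmt = conn.createStatement()) {
            stmt.execute(MERGE_SQL);
        }
    }
}
```

Note that the target table still has to be a transactional (ACID) ORC table for MERGE to be accepted on the Hive side; JDBC only changes which engine parses the statement.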

lzfw57am 2#

Spark does not support UPDATE or DELETE statements, so the exception is expected behavior.
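Since Spark 1.6 can read Hive tables but cannot mutate them, a common workaround is to recompute the target from a join with the change set and overwrite it. The merge semantics being emulated can be sketched in plain Java (the column names and the operation codes 1 = update, 2 = delete follow the question; the data layout is an illustration, not an API):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MergeSemantics {
    // target: id -> {a, b}; each source change is {id, operation, a, b}.
    // Mirrors the three WHEN branches of the failing MERGE statement:
    // matched + op 1 -> UPDATE, matched + op 2 -> DELETE, unmatched -> INSERT.
    static Map<Integer, String[]> merge(Map<Integer, String[]> target,
                                        List<Object[]> source) {
        Map<Integer, String[]> result = new LinkedHashMap<>(target);
        for (Object[] c : source) {
            int id = (Integer) c[0];
            int op = (Integer) c[1];
            String a = (String) c[2], b = (String) c[3];
            if (target.containsKey(id)) {
                if (op == 1) result.put(id, new String[]{a, b}); // UPDATE
                else if (op == 2) result.remove(id);             // DELETE
            } else {
                result.put(id, new String[]{a, b});              // INSERT
            }
        }
        return result;
    }
}
```

In Spark 1.6 the equivalent would be a full outer join of the target DataFrame with the change set, followed by writing the result to a staging location and swapping it in, since `INSERT OVERWRITE` into the table being read is not safe.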
