spark.sql throws a parse error when querying Hive table lock information in Spark 2.4.0

Asked by qltillow on 2021-06-27 in Hive

Querying Hive table lock information with the following SQL statement fails in the Spark 2.4.0 shell:

spark.sql("show locks dbName.tableName")

It throws the error below:

scala> spark.sql("show locks dbName.tableName")
org.apache.spark.sql.catalyst.parser.ParseException:
Operation not allowed: show locks(line 1, pos 0)

== SQL ==
show locks dbName.tableName
^^^
  at org.apache.spark.sql.catalyst.parser.ParserUtils$.operationNotAllowed(ParserUtils.scala:39)
  at org.apache.spark.sql.execution.SparkSqlAstBuilder$$anonfun$visitFailNativeCommand$1.apply(SparkSqlParser.scala:1001)
  at org.apache.spark.sql.execution.SparkSqlAstBuilder$$anonfun$visitFailNativeCommand$1.apply(SparkSqlParser.scala:992)
  at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:99)
  at org.apache.spark.sql.execution.SparkSqlAstBuilder.visitFailNativeCommand(SparkSqlParser.scala:992)
  at org.apache.spark.sql.execution.SparkSqlAstBuilder.visitFailNativeCommand(SparkSqlParser.scala:55)
  at org.apache.spark.sql.catalyst.parser.SqlBaseParser$FailNativeCommandContext.accept(SqlBaseParser.java:723)
  at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:42)
  at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleStatement$1.apply(AstBuilder.scala:71)
  at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleStatement$1.apply(AstBuilder.scala:71)
  at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:99)
  at org.apache.spark.sql.catalyst.parser.AstBuilder.visitSingleStatement(AstBuilder.scala:70)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:69)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:68)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:97)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:68)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:637)
  ... 48 elided

However, the same query works in Spark 1.6.0 with HiveContext.
My use case is to fetch lock information for a Hive table from within a Spark job on Spark 2.4.0.
Is there any way I can get table lock information in a Spark job?


4xy9mtcn

See the link below: Spark 2.4.0 does not support locks.
https://github.com/apache/spark/blob/branch-2.4/sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4#L195
The grammar file lists `SHOW LOCKS` among the unsupported Hive native commands (the rule matching `kw1=SHOW kw2=LOCKS`), which is why the parser raises "Operation not allowed".
The Concurrency Control section at https://dbdb.io/db/spark-sql likewise states that Spark SQL does not support locks.
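Since the statement is rejected by Spark's own SQL parser, one possible workaround is to send `SHOW LOCKS` to HiveServer2 directly over JDBC, bypassing Spark entirely. A minimal sketch, assuming a reachable HiveServer2 endpoint and the Hive JDBC driver on the classpath; the JDBC URL, database, and table names below are placeholders, not values from the question:

```scala
import java.sql.DriverManager

// Build the SHOW LOCKS statement for a given table.
// Hive accepts "SHOW LOCKS db.table" (optionally followed by EXTENDED).
def showLocksSql(db: String, table: String): String =
  s"SHOW LOCKS $db.$table"

// Run SHOW LOCKS against HiveServer2 over JDBC, bypassing Spark's parser.
// jdbcUrl is a placeholder, e.g. "jdbc:hive2://hs2-host:10000/default".
def fetchLocks(jdbcUrl: String, db: String, table: String): Seq[Seq[String]] = {
  val conn = DriverManager.getConnection(jdbcUrl)
  try {
    val rs   = conn.createStatement().executeQuery(showLocksSql(db, table))
    val cols = rs.getMetaData.getColumnCount
    val rows = scala.collection.mutable.ArrayBuffer.empty[Seq[String]]
    while (rs.next()) rows += (1 to cols).map(i => rs.getString(i))
    rows.toSeq
  } finally conn.close()
}
```

This keeps the Spark job itself unchanged and only adds a side channel to Hive for the lock metadata; whether the lock rows are populated still depends on the Hive lock manager configured on the cluster.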
