Error while compiling statement: FAILED: ParseException line 1:14 cannot recognize input near 'select' '*' 'from' in join source

t9eec4r0 · posted 2021-06-26 in Hive

I have some Scala code in which I am trying to load data from a Hive table using Spark, and I get the following error when connecting to the Hive table.

Code

import java.io.File

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[2]")
  .appName("interfacing spark sql to hive metastore without configuration file")
  .config("hive.metastore.warehouse.dir", "C:\\xxxx\\xxxx\\xxxx\\")
  .enableHiveSupport() // don't forget to enable hive support
  .getOrCreate()

val sc = spark.sparkContext
val sqlContext = spark.sqlContext
val driverName = "org.apache.hive.jdbc.HiveDriver"

// Kerberos / SSL settings for the secured cluster
System.setProperty("javax.net.ssl.trustStore", "C:\\xxxx\\xxxx\\xxxx\\xxxx\\xxxx\\security\\jssecacerts")
System.setProperty("java.security.krb5.debug", "true")
System.setProperty("java.security.krb5.conf", new File("C:\\xxxx\\xxxx\\krb5.conf").getAbsolutePath)
System.setProperty("javax.security.auth.useSubjectCredsOnly", "false")
System.setProperty("java.security.auth.login.config", new File("C:\\xxxx\\xxxx\\jaas.conf").getAbsolutePath)

val hiveurl = "jdbc:hive2://xxxxx.octorp.com:10000/devl_dkp;user=pcpdosr;password=Kopdevp1;ssl=true;AuthMech=3"
//;mapred.job.queue.name=dkl"
val connectionProperties = new java.util.Properties()

sc.setLocalProperty("spark.scheduler.pool", "dkl")
val hiveQuery = "select * from devl_dkp.employee"

val hiveResult = spark.read.option("driver", driverName).jdbc(hiveurl, hiveQuery, connectionProperties).collect()

Error

exception caught: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: ParseException line 1:14 cannot recognize input near 'select' '*' 'from' in join source

Any help would be appreciated.
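For what it's worth, this kind of ParseException typically happens because the second argument of `spark.read.jdbc` is a table name, not a query: Spark substitutes it directly into a generated statement of the form `SELECT ... FROM <dbtable> ...`, so passing a raw query produces invalid SQL like `SELECT * FROM select * from devl_dkp.employee`, and Hive fails to parse it right at the second `select` (position 14, as in the error above). A minimal sketch of the usual workaround, assuming the rest of the setup above is unchanged, is to wrap the query in parentheses with an alias (here `emp`, an arbitrary name) so it is valid wherever a table name is expected:

```scala
// Spark builds "SELECT * FROM <dbtable> ..." around this string,
// so a raw query must be passed as a parenthesised subquery with an alias.
val hiveQuery = "(select * from devl_dkp.employee) emp"

val hiveResult = spark.read
  .option("driver", driverName)
  .jdbc(hiveurl, hiveQuery, connectionProperties)
  .collect()
```

Equivalently, since `enableHiveSupport()` is already on, the same data can often be read without JDBC at all via `spark.sql("select * from devl_dkp.employee")`, provided the local session can reach the metastore.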
