Querying a remote Hive table from a local Spark 2.x program

unguejic · posted 2021-06-26 in Hive

When I run a local Spark 2.x program from Eclipse, I get the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: When hive.metastore.uris is set, please set spark.sql.authorization.enabled and hive.security.authorization.enabled to true to enable authorization.;
The code used:

System.setProperty("hadoop.home.dir", "D:/winutils")

// Kerberos-related settings
val ZKServerPrincipal = "zookeeper/hadoop.hadoop.com"
val ZOOKEEPER_DEFAULT_LOGIN_CONTEXT_NAME = "Client"
val ZOOKEEPER_SERVER_PRINCIPAL_KEY = "zookeeper.server.principal"
val hadoopConf: Configuration = new Configuration()
LoginUtil.setZookeeperServerPrincipal(ZOOKEEPER_SERVER_PRINCIPAL_KEY, ZKServerPrincipal)
LoginUtil.login(userPrincipal, userKeytabPath, krb5ConfPath, hadoopConf)

// Creating the Spark session
val spark = SparkSession.builder()
  .appName("conenction")
  .config("spark.master", "local")
  .config("spark.sql.authorization.enabled", "true")
  .enableHiveSupport()
  .getOrCreate()

val df75 = spark.sql("select * from dbname.tablename limit 10")
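No answer was posted, but the exception text itself names the likely gap: when hive.metastore.uris is set, Spark asks for both spark.sql.authorization.enabled and hive.security.authorization.enabled to be true, while the builder above sets only the first. A minimal sketch of the adjusted session builder follows; this is an assumption drawn from the error message (and the userPrincipal/keytab variables remain the question's own placeholders), not a verified fix:

```scala
// Sketch: set BOTH authorization flags named in the AnalysisException.
// hive.security.authorization.enabled was missing from the original
// builder; adding it here is an assumption based on the error text.
val spark = SparkSession.builder()
  .appName("conenction")
  .config("spark.master", "local")
  .config("spark.sql.authorization.enabled", "true")
  .config("hive.security.authorization.enabled", "true")
  .enableHiveSupport()
  .getOrCreate()
```

If the error persists, it is worth checking whether the cluster's hive-site.xml (with the hive.metastore.uris entry) is on the local program's classpath, since Spark reads these Hive settings from there as well.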

No answers yet.
