Question
I am trying to read from a Hive table, but I get the following error:
[error] (run-main-0) org.apache.spark.sql.AnalysisException: Table or view not found: tags; line 1 pos 14
I have placed hive-site.xml in both $SPARK_HOME/conf and $HIVE_HOME/conf. I also had no trouble using Sqoop to pull the data from MySQL and import it into Hive. Is there a problem with my Scala code, or is this a configuration error?
Scala code:
package test1

import java.io.File
import org.apache.spark.sql.Row
import org.apache.spark.sql.SparkSession

case class Movie(movieid: String, title: String, genres: String)
case class Tag(userid: String, title: String, tag: String)

object SparkHiveTest {
  def main(args: Array[String]) {
    val warehouseLocation = new File("spark-warehouse").getAbsolutePath
    val spark = SparkSession
      .builder()
      .master("local")
      .appName("SparkHiveExample")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport()
      .getOrCreate()
    spark.sql("SELECT * FROM tags").show()
    spark.stop()
  }
}
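For reference, a minimal check of whether such a session sees the Hive metastore at all might look like the sketch below (spark.catalog and SHOW DATABASES are standard Spark APIs; if only an empty default database shows up, Spark is falling back to a local Derby metastore instead of reading hive-site.xml):

import org.apache.spark.sql.SparkSession

object CatalogCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local")
      .appName("CatalogCheck")
      .enableHiveSupport()
      .getOrCreate()

    // If only an empty "default" database appears here, the session is not reading
    // the Hive metastore and has created a local Derby metastore instead.
    spark.sql("SHOW DATABASES").show()
    spark.catalog.listTables("default").show()

    spark.stop()
  }
}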
hive-site.xml:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
    <description>metadata is stored in a MySQL server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>MySQL JDBC driver class</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
    <description>user name for connecting to mysql server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
    <description>password for connecting to mysql server</description>
  </property>
</configuration>
2 Answers
r7s23pms1#
According to the API documentation for HiveContext:
"An instance of the Spark SQL execution engine that integrates with data stored in Hive. Configuration for Hive is read from hive-site.xml on the classpath."
So make sure to put your hive-site.xml into the project's resources folder in your IDE. That solved the problem for me right away.
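A quick way to confirm that the file is actually visible on the application classpath is a sketch like the following (it assumes the file was copied to src/main/resources in a standard sbt/Maven layout; adjust for your project):

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // getResource returns null when the file is not on the classpath.
    val url = getClass.getResource("/hive-site.xml")
    if (url == null) println("hive-site.xml is NOT on the classpath")
    else println(s"hive-site.xml found at: $url")
  }
}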
k7fdbhmy2#
Make sure the Hive metastore is configured correctly:
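One way to rule out a metastore misconfiguration is to point the SparkSession at the metastore service explicitly rather than relying on hive-site.xml being picked up. A minimal sketch, assuming the metastore service is running on the default thrift port 9083 (replace the URI with your own):

import org.apache.spark.sql.SparkSession

object ExplicitMetastore {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local")
      .appName("ExplicitMetastore")
      // thrift://localhost:9083 is an assumption (the usual default); use your metastore's URI
      .config("hive.metastore.uris", "thrift://localhost:9083")
      .enableHiveSupport()
      .getOrCreate()

    // If the connection works, the Hive tables (including "tags") should be listed here.
    spark.catalog.listTables().show()

    spark.stop()
  }
}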