How do I saveAsTable to Hive so that the Hive table is a managed table?

wlwcrazw  posted on 2021-06-26  in Hive

When I save a table without an explicit path, the Hive metastore ends up with a spurious "path" property pointing at "/user/hive/warehouse" instead of "/hive/warehouse". If I set the path explicitly with .option("path", "/hive/warehouse"), everything works except that Hive creates an external table. Is there a way to save a managed table to the Hive metastore without a spurious path property that disagrees with the file location in Hive?

from pyspark.sql import SparkSession

spark = SparkSession.builder.master(master_url).enableHiveSupport().getOrCreate()

df = spark.range(100)

df.write.saveAsTable("test1")
df.write.option("path", "/hive/warehouse").saveAsTable("test2")

hive> describe formatted test1;
OK

# col_name              data_type               comment

id                      bigint                                      

# Detailed Table Information

Database:               default                  
Owner:                  root                     
CreateTime:             Fri Mar 10 18:53:07 UTC 2017     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               file:/hive/warehouse/test1 
Table Type:             MANAGED_TABLE            
Table Parameters:        
    spark.sql.sources.provider  parquet             
    spark.sql.sources.schema.numParts   1                   
    spark.sql.sources.schema.part.0 {"type":"struct","fields":[{"name":"id","type":"long","nullable":true,"metadata":{}}]}
    transient_lastDdlTime   1489171987          

# Storage Information

SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe  
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    path                    file:/user/hive/warehouse/test1
    serialization.format    1                   
Time taken: 0.423 seconds, Fetched: 30 row(s)

hive> describe formatted test2;
OK

# col_name              data_type               comment

id                      bigint                                      

# Detailed Table Information

Database:               default                  
Owner:                  root                     
CreateTime:             Fri Mar 10 16:02:07 UTC 2017     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               file:/hive/warehouse/test2   
Table Type:             EXTERNAL_TABLE           
Table Parameters:        
    COLUMN_STATS_ACCURATE   false               
    EXTERNAL                TRUE                
    numFiles                2                   
    numRows                 -1                  
    rawDataSize             -1                  
    spark.sql.sources.provider  parquet             
    spark.sql.sources.schema.numParts   1                   
    spark.sql.sources.schema.part.0 {"type":"struct","fields":[{"name":"id","type":"long","nullable":true,"metadata":{}}]}
    totalSize               4755                
    transient_lastDdlTime   1489161727          

# Storage Information

SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe  
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    path                    file:/hive/warehouse/test2
    serialization.format    1                   
Time taken: 0.402 seconds, Fetched: 36 row(s)

gc0ot86w1#

hive.metastore.warehouse.dir
Default Value: /user/hive/warehouse
Added In: Hive 0.2.0
Location of the default database for the warehouse.
https://cwiki.apache.org/confluence/display/hive/configuration+properties


k5hmc34c2#

Solved it. For anyone with a similar problem, I'll post my fix.
The incorrect "path" parameter only shows up when saving a table to the default Hive database (as above). That suggested that the "old" default database was still using the old value of hive.metastore.warehouse.dir, while newly created databases pick up the new value.
So the fix was to drop and recreate the default database; now every database created in the Hive metastore uses the correct hive.metastore.warehouse.dir.

spark.sql("create database testdb")
spark.sql("use testdb")
df.write.saveAsTable("test3")

hive> describe formatted testdb.test3;
OK

# col_name              data_type               comment

id                      bigint                                      

# Detailed Table Information

Database:               testdb                   
Owner:                  root                     
CreateTime:             Fri Mar 10 22:10:10 UTC 2017     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               file:/hive/warehouse/testdb.db/test3   
Table Type:             MANAGED_TABLE            
Table Parameters:        
    COLUMN_STATS_ACCURATE   false               
    numFiles                1                   
    numRows                 -1                  
    rawDataSize             -1                  
    spark.sql.sources.provider  parquet             
    spark.sql.sources.schema.numParts   1                   
    spark.sql.sources.schema.part.0 {"type":"struct","fields":[{"name":"id","type":"long","nullable":true,"metadata":{}}]}
    totalSize               409                 
    transient_lastDdlTime   1489183810          

# Storage Information

SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe  
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    path                    file:/hive/warehouse/testdb.db/test3
    serialization.format    1                   
Time taken: 0.243 seconds, Fetched: 35 row(s)
