Spark DataFrame insert into HBase error

roejwanj · posted 2021-06-08 in HBase

I have a DataFrame with this schema:

 |-- Name1: string (nullable = true)
 |-- Name2: string (nullable = true)
 |-- App: string (nullable = true)
 ...
 |-- Duration: float (nullable = false)
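
For context, a DataFrame of this shape can be reproduced in spark-shell; the sample rows below are made up, only the column names and types follow the schema (Append_Ot is the name used further down):

    // Hypothetical sample rows; only the column names and types match the schema above
    val Append_Ot = Seq(
      ("a", "b", "app1", 1.5f),
      ("c", "d", "app2", 2.0f)
    ).toDF("Name1", "Name2", "App", "Duration")
    Append_Ot.printSchema()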

I want to insert it into an HBase table, following several references. I define the catalog:

def catalog = s"""{
       |"table":{"namespace":"default", "name":"otarie"},
       |"rowkey":"key",
       |"columns":{
         |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
         |"col1":{"cf":"cf1", "col":"Name1", "type":"boolean"},
         |"col2":{"cf":"cf2", "col":"Name2", "type":"double"},
         |"col3":{"cf":"cf3", "col":"App", "type":"float"},
            ........
         |"co27":{"cf":"cf27", "col":"Duration", "type":"string"}
       |}
     |}""".stripMargin

Then I try to write my DataFrame:

Append_Ot.write.options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5")).format("org.apache.hadoop.hbase.spark ").save()

I am using the spark shell, and I get this error:

<console>:155: error: not found: value HBaseTableCatalog
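
This error means the spark-shell session cannot resolve the HBaseTableCatalog symbol, which usually comes down to the connector jar not being on the classpath and/or a missing import. A minimal sketch, assuming the Apache hbase-connectors (hbase-spark) module implied by format("org.apache.hadoop.hbase.spark"); the jar paths and versions are placeholders:

    // Launch spark-shell with the connector and HBase client jars (placeholder paths/versions):
    //   spark-shell --jars /path/to/hbase-spark-<version>.jar,/path/to/hbase-client-<version>.jar

    // Import the catalog class before referring to it:
    import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

    Append_Ot.write
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5"))
      .format("org.apache.hadoop.hbase.spark")   // note: no trailing space in the data source name
      .save()

If the Hortonworks shc-core connector is the one installed instead, the class lives under org.apache.spark.sql.execution.datasources.hbase, and that package name is also what goes into format(...).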

No answers yet!

