RDD split gives "missing parameter type" error

hm2xizp9 · posted 2021-06-01 in Hadoop

I am trying to split an RDD created from a DataFrame and I cannot figure out why it fails.
I am not writing out every column name here, but the actual SQL contains all of the column names, so the SQL itself is not the problem.

val df = sql("SELECT col1, col2, col3,... from tableName")
val rddF = df.toJavaRDD

rddF.take(1)
res46: Array[org.apache.spark.sql.Row] = Array([2017-02-26,100102-AF,100134402,119855,1004445,0.0000,0.0000,-3.3,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000,0.0000])

scala> rddF.map(x => x.split(","))
<console>:31: error: missing parameter type
       rddF.map(x => x.split(","))

Do you know what causes this error? I am using Spark 2.2.0.

oxcyiej7 · answer #1

rddF holds Row objects (an Array of Row, as you can see in the output res46: Array[org.apache.spark.sql.Row]), and you cannot split a Row the way you split a String.
You can do something like this instead:

val df = sql("SELECT col1, col2, col3,... from tableName")
val rddF = df.rdd

rddF.map(x => (x.getAs[String]("col1"), x.getAs[String]("col2"), x.get(2)))
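If what you actually need is an Array[String] per record rather than typed fields, one option is to render each Row to a single delimited line and then split that line. Below is a minimal, self-contained sketch; the local SparkSession, the object name RowSplitSketch, and the two sample rows are stand-ins I made up for the original tableName query, so adjust the column types to match your real schema.

import org.apache.spark.sql.SparkSession

object RowSplitSketch {
  def main(args: Array[String]): Unit = {
    // Local session only for this sketch; in the original question the
    // DataFrame comes from sql("SELECT ... from tableName") instead.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("row-split-sketch")
      .getOrCreate()
    import spark.implicits._

    // Stand-in for the DataFrame built from the SQL query in the question.
    val df = Seq(
      ("2017-02-26", "100102-AF", -3.3),
      ("2017-02-27", "100103-BG", 0.0)
    ).toDF("col1", "col2", "col3")

    // Option 1: read typed fields straight off each Row, as in the answer above.
    // df.rdd is a Scala RDD[Row], so the closure's parameter type is inferred
    // and the "missing parameter type" error does not appear.
    val typed = df.rdd.map(r => (r.getAs[String]("col1"), r.getAs[String]("col2"), r.get(2)))

    // Option 2: turn each Row into one comma-separated line, then split the
    // resulting String to get an Array[String] per record.
    val splitArrays = df.rdd.map(r => r.mkString(",")).map(_.split(","))

    typed.take(1).foreach(println)
    splitArrays.take(1).foreach(a => println(a.mkString(" | ")))

    spark.stop()
  }
}

Note that mkString followed by split will misbehave if any column value itself contains a comma; for real exports, selecting the columns you need directly (or writing out with df.write.csv) is usually the safer route.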
