object SparkHadoopUtil

mzmfm0qo · posted in Spark on 2021-07-09

Why is SparkHadoopUtil not accessible here, even though the imports succeed, when it was accessible in earlier versions of Spark?

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.2
      /_/

Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_282)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import org.apache.spark.deploy.SparkHadoopUtil
import org.apache.spark.deploy.SparkHadoopUtil

scala> import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.conf.Configuration

scala> 

scala> 

scala>  val hadoopConf: Configuration = SparkHadoopUtil.get.conf
<console>:25: error: object SparkHadoopUtil in package deploy cannot be accessed in package org.apache.spark.deploy
        val hadoopConf: Configuration = SparkHadoopUtil.get.conf
                                        ^

scala>

r1zk6ea1 · 1#

That is because the SparkHadoopUtil class was made private to the spark package in Spark 3. Here is the difference between Spark 2.4 and Spark 3.0.
Spark 2.4:

@DeveloperApi
class SparkHadoopUtil extends Logging {

Spark 3.0:

private[spark] class SparkHadoopUtil extends Logging {
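
If all you need is a Hadoop Configuration in Spark 3, the public SparkContext.hadoopConfiguration accessor returns one, so the now-private SparkHadoopUtil is not required. A minimal sketch, assuming a local session (the app name and master are placeholders; in spark-shell the spark value is already defined):

import org.apache.hadoop.conf.Configuration
import org.apache.spark.sql.SparkSession

// Build (or reuse) a SparkSession; in spark-shell, `spark` is predefined.
val spark = SparkSession.builder()
  .appName("hadoop-conf-demo") // placeholder app name
  .master("local[*]")          // placeholder master for a local run
  .getOrCreate()

// Public replacement for SparkHadoopUtil.get.conf:
val hadoopConf: Configuration = spark.sparkContext.hadoopConfiguration

println(hadoopConf.get("fs.defaultFS")) // read a setting as usual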
