Why is spark.shuffle.minNumPartitionsToHighlyCompress missing from the Spark 2.4.6 configuration documentation?
Here is the relevant code from package.scala:
  private[spark] val SHUFFLE_MIN_NUM_PARTS_TO_HIGHLY_COMPRESS =
    ConfigBuilder("spark.shuffle.minNumPartitionsToHighlyCompress")
      .internal()
      .doc("Number of partitions to determine if MapStatus should use HighlyCompressedMapStatus")
      .intConf
      .checkValue(v => v > 0, "The value should be a positive integer.")
      .createWithDefault(2000)
Will this change in the future?
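For context: the entry is marked .internal() and does not appear on the documented configuration page, yet it can apparently still be set like any ordinary Spark property. Below is a minimal sketch, assuming a local SparkSession on Spark 2.4.x; the application name and the 3000 threshold are arbitrary illustration values, not recommendations.

  // Minimal sketch: override the internal shuffle threshold and read it back.
  import org.apache.spark.sql.SparkSession

  object InternalConfigDemo {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .master("local[*]")
        .appName("internal-config-demo")
        // Override the threshold whose compiled-in default is 2000.
        .config("spark.shuffle.minNumPartitionsToHighlyCompress", "3000")
        .getOrCreate()

      // Read the value back from the runtime configuration.
      println(spark.conf.get("spark.shuffle.minNumPartitionsToHighlyCompress"))

      spark.stop()
    }
  }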