How do I stop INFO messages from displaying on the Spark console?

but5z9lq · posted 2021-05-27 in Spark

I want to stop the various messages that appear on the spark-shell console.
I tried editing the log4j.properties file to stop these messages.
Here are the contents of log4j.properties:

```
# Define the root logger with appender file
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
```

But the messages are still displayed on the console.
Here are some example messages:

```
15/01/05 15:11:45 INFO SparkEnv: Registering BlockManagerMaster
15/01/05 15:11:45 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150105151145-b1ba
15/01/05 15:11:45 INFO MemoryStore: MemoryStore started with capacity 0.0 B.
15/01/05 15:11:45 INFO ConnectionManager: Bound socket to port 44728 with id = ConnectionManagerId(192.168.100.85,44728)
15/01/05 15:11:45 INFO BlockManagerMaster: Trying to register BlockManager
15/01/05 15:11:45 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 192.168.100.85:44728 with 0.0 B RAM
15/01/05 15:11:45 INFO BlockManagerMaster: Registered BlockManager
15/01/05 15:11:45 INFO HttpServer: Starting HTTP Server
15/01/05 15:11:45 INFO HttpBroadcast: Broadcast server star
```

How do I stop these messages?

aamkag61 · answer #16

Right after starting spark-shell, type:

```
sc.setLogLevel("ERROR")
```
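
Valid level strings are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE and WARN. The new level only applies from the point of the call onward, so INFO lines printed while the shell itself starts up will still appear.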

In Spark 2.0 (Scala):

```
val spark = SparkSession.builder.getOrCreate()
spark.sparkContext.setLogLevel("ERROR")
```
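
Per the API docs, setLogLevel overrides any user-defined log settings, so it takes effect even when a custom log4j.properties is being ignored.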

API docs: https://spark.apache.org/docs/2.2.0/api/scala/index.html#org.apache.spark.sql.SparkSession
For Java:

```
SparkSession spark = SparkSession.builder().getOrCreate();
spark.sparkContext().setLogLevel("ERROR");
```
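
On shells predating sc.setLogLevel (added in Spark 1.4), the thresholds can be raised directly through Log4j instead. A minimal Scala sketch using the Log4j 1.x API bundled with these Spark versions, assuming org and akka are the noisy logger prefixes:

```
import org.apache.log4j.{Level, Logger}

// Raise the threshold on the noisiest logger hierarchies:
// INFO and WARN from these packages are suppressed, ERROR still shows.
Logger.getLogger("org").setLevel(Level.ERROR)
Logger.getLogger("akka").setLevel(Level.ERROR)
```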

hfwmuf9z · answer #17

I just add this line to all my PySpark scripts, right below the import statements.

```
SparkSession.builder.getOrCreate().sparkContext.setLogLevel("ERROR")
```

Example header of a PySpark script:

```
from pyspark.sql import SparkSession, functions as fs
SparkSession.builder.getOrCreate().sparkContext.setLogLevel("ERROR")
```
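
Because SparkSession.builder.getOrCreate() returns the already-running session on later calls, setting the level once near the top of the script quiets all subsequent Spark output in that process; only messages emitted while the session is first created still get through.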
