I'm running PySpark from the Spyder IDE, and every run prints the following warnings:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/02/15 17:05:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/02/15 17:05:29 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
I have tried editing the file C:\spark\spark-3.2.1-bin-hadoop2.7\conf\log4j.properties.template to change the logging level to "ERROR", but it has no effect.
1 Answer
1. Rename log4j.properties.template to log4j.properties. Spark ignores the .template file; only a file named log4j.properties is picked up, which is why editing the template had no effect.
2. Make sure log4j.properties is on the classpath or under $SPARK_HOME/conf/.
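Concretely, after renaming the file, the line to change in it is the root logger level (this key appears in the log4j 1.x template bundled with spark-3.2.1-bin-hadoop2.7):

```
# $SPARK_HOME/conf/log4j.properties  (renamed from log4j.properties.template)
# Change the default INFO/WARN to ERROR to silence the warnings shown above
log4j.rootCategory=ERROR, console
```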