Suppressing INFO logs in Spark by changing the log4j.properties file

yv5phkfx  asked on 2022-11-06  in  Spark
Follow (0) | Answers (1) | Views (128)

I read that, to suppress the excessive INFO log messages in Spark, I need to change the line

log4j.rootCategory=INFO, console

to

log4j.rootCategory=ERROR, console

in my log4j.properties file, which I found at

/usr/local/Cellar/apache-spark/3.3.0/libexec/conf/log4j.properties

However, the structure of my file seems to differ from what I've seen from other users:


# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

log4j.logger.org.apache.spark.api.python.PythonGatewayServer=error

# Set everything to be logged to the console
rootLogger.level = error
rootLogger.appenderRef.stdout.ref = console

# In the pattern layout configuration below, we specify an explicit `%ex` conversion
# pattern for logging Throwables. If this was omitted, then (by default) Log4J would
# implicitly add an `%xEx` conversion pattern which logs stacktraces with additional
# class packaging information. That extra information can sometimes add a substantial
# performance overhead, so we disable it in our default logging config.
# For more information, see SPARK-39361.
appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n%ex

# Set the default spark-shell/spark-sql log level to WARN. When running the
# spark-shell/spark-sql, the log level for these classes is used to overwrite
# the root logger's log level, so that the user can have different defaults
# for the shell and regular Spark apps.
logger.repl.name = org.apache.spark.repl.Main
logger.repl.level = warn

logger.thriftserver.name = org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver
logger.thriftserver.level = warn

# Settings to quiet third party logs that are too verbose
logger.jetty1.name = org.sparkproject.jetty
logger.jetty1.level = warn
logger.jetty2.name = org.sparkproject.jetty.util.component.AbstractLifeCycle
logger.jetty2.level = error
logger.replexprTyper.name = org.apache.spark.repl.SparkIMain$exprTyper
logger.replexprTyper.level = info
logger.replSparkILoopInterpreter.name = org.apache.spark.repl.SparkILoop$SparkILoopInterpreter
logger.replSparkILoopInterpreter.level = info
logger.parquet1.name = org.apache.parquet
logger.parquet1.level = error
logger.parquet2.name = parquet
logger.parquet2.level = error

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
logger.RetryingHMSHandler.name = org.apache.hadoop.hive.metastore.RetryingHMSHandler
logger.RetryingHMSHandler.level = fatal
logger.FunctionRegistry.name = org.apache.hadoop.hive.ql.exec.FunctionRegistry
logger.FunctionRegistry.level = error

# For deploying Spark ThriftServer
# SPARK-34128: Suppress undesirable TTransportException warnings involved in THRIFT-4805
appender.console.filter.1.type = RegexFilter
appender.console.filter.1.regex = .*Thrift error occurred during processing of message.*
appender.console.filter.1.onMatch = deny
appender.console.filter.1.onMismatch = neutral

So I just added that line to the file, but it didn't change anything.
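One thing worth noting: the file shown above is in Log4j 2 properties syntax (note the `rootLogger.level` and `logger.*.name`/`logger.*.level` pairs), so a Log4j 1.x-style line such as `log4j.logger...=error` is most likely silently ignored. A sketch of the Log4j 2 equivalent, assuming that syntax applies here (the label `pygateway` is an arbitrary name I chose, not something Spark requires):

```properties
# Log4j 2 syntax; "pygateway" is just an arbitrary label for this logger entry
logger.pygateway.name = org.apache.spark.api.python.PythonGatewayServer
logger.pygateway.level = error
```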
When I type spark-submit somepythonscript.py in my Mac terminal, at the beginning of the resulting output I see:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

Unfortunately, I don't even know where this "org" folder is. This is confusing, because, as described above, my properties file is located somewhere else entirely.
I furthermore read here that one should add

--files log4j.properties

to my spark-submit call. In total, in my case this would be:

spark-submit somepythonscript.py --files log4j.properties

However, this didn't do anything either. The script works, but the INFO logs are still there. What am I doing wrong?
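(For context on why `--files` alone may not help: it only ships the file alongside the job; it does not tell the driver JVM to load it as the logging configuration. A commonly suggested alternative, sketched here under the assumption of a Log4j 2-based Spark build such as 3.3 and with `/path/to/log4j2.properties` as a placeholder path, is to point Log4j at the file explicitly:

```shell
# Hedged sketch: pass the config file location to both driver and executors.
# Spark 3.3 uses Log4j 2, whose system property is log4j2.configurationFile;
# older Log4j 1.x builds used -Dlog4j.configuration instead.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dlog4j2.configurationFile=file:/path/to/log4j2.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j2.configurationFile=file:/path/to/log4j2.properties" \
  somepythonscript.py
```

Whether this is needed depends on whether Spark picks up the conf directory file on its own.)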

ffdz8vbo #1

This answer helped me when running Spark in local mode; the only difference is that the file is log4j2.properties:
https://stackoverflow.com/a/52648367/11951587
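To spell out the naming point: Spark 3.3 looks for conf/log4j2.properties (it ships a log4j2.properties.template there), so a file named log4j.properties is not picked up, which would explain the "default log4j profile" message. A minimal sketch, assuming the Homebrew layout from the question:

```shell
# In the Spark conf directory (Homebrew layout from the question):
cd /usr/local/Cellar/apache-spark/3.3.0/libexec/conf
cp log4j2.properties.template log4j2.properties
# then edit log4j2.properties and set:
#   rootLogger.level = error
```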
