Custom log4j log levels not showing in a Java Spark program

yftpprvb | published 2021-05-27 in Spark

I have been trying to introduce a custom log level in log4j from Java code. I followed two approaches:
The official documentation approach (https://logging.apache.org/log4j/2.x/manual/customloglevels.html), where a new level is created in a single line of code and used like this:

static final Level CUSTOM = Level.forName("CUSTOM", 350);
logger.log(CUSTOM, "Test message");
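
Put in context, a self-contained version of that first approach looks roughly like this (the class name is mine, purely for illustration; in log4j 2.x a lower intLevel means more severe: ERROR=200, WARN=300, INFO=400, so 350 sits between WARN and INFO):

import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class CustomLevelDemo {

    // 350 is between WARN (300) and INFO (400), so a logger configured
    // at INFO should pass events logged at CUSTOM.
    static final Level CUSTOM = Level.forName("CUSTOM", 350);

    private static final Logger logger = LogManager.getLogger(CustomLevelDemo.class);

    public static void main(String[] args) {
        logger.log(CUSTOM, "Test message");
    }
}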

I also tried the approach of writing an entire custom level class, as described in this article.
I created the custom log level class as follows:

// Package matches the class reference in my log4j.xml (kafkaExample.CrunchifyLog4jLevel).
package kafkaExample;

// Note: extending Level and using DEBUG_INT is the log4j 1.x API.
import org.apache.log4j.Level;

public class CrunchifyLog4jLevel extends Level {

    /**
     * Value of the CrunchifyLog4jLevel level. This value is lower than
     * DEBUG_INT and higher than TRACE_INT.
     */
    public static final int CRUNCHIFY_INT = DEBUG_INT - 10;

    /**
     * Level representing my log level.
     */
    public static final Level CRUNCHIFY = new CrunchifyLog4jLevel(CRUNCHIFY_INT, "CRUNCHIFY", 10);

    /**
     * Constructor.
     */
    protected CrunchifyLog4jLevel(int level, String levelStr, int syslogEquivalent) {
        super(level, levelStr, syslogEquivalent);
    }

    /**
     * Checks whether logArgument is the "CRUNCHIFY" level. If so, returns
     * CRUNCHIFY; otherwise calls toLevel(String, Level), passing Level.DEBUG
     * as the default level.
     */
    public static Level toLevel(String logArgument) {
        if (logArgument != null && logArgument.equalsIgnoreCase("CRUNCHIFY")) {
            return CRUNCHIFY;
        }
        return toLevel(logArgument, Level.DEBUG);
    }

    /**
     * Checks whether val is CRUNCHIFY_INT. If so, returns CRUNCHIFY;
     * otherwise calls toLevel(int, Level), passing Level.DEBUG as the
     * default level.
     */
    public static Level toLevel(int val) {
        if (val == CRUNCHIFY_INT) {
            return CRUNCHIFY;
        }
        return toLevel(val, Level.DEBUG);
    }

    /**
     * Checks whether val is CRUNCHIFY_INT. If so, returns CRUNCHIFY;
     * otherwise calls Level.toLevel(int, Level).
     */
    public static Level toLevel(int val, Level defaultLevel) {
        if (val == CRUNCHIFY_INT) {
            return CRUNCHIFY;
        }
        return Level.toLevel(val, defaultLevel);
    }

    /**
     * Checks whether logArgument is the "CRUNCHIFY" level. If so, returns
     * CRUNCHIFY; otherwise calls Level.toLevel(String, Level).
     */
    public static Level toLevel(String logArgument, Level defaultLevel) {
        if (logArgument != null && logArgument.equalsIgnoreCase("CRUNCHIFY")) {
            return CRUNCHIFY;
        }
        return Level.toLevel(logArgument, defaultLevel);
    }
}
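
To double-check where CRUNCHIFY sits: log4j 1.x numbers levels the opposite way from 2.x (higher int = more severe), and an event is logged when its level is greater than or equal to the logger's effective level. A quick sketch to verify this (assuming the class above is on the classpath) would be:

import org.apache.log4j.Level;

public class LevelCheckDemo {
    public static void main(String[] args) {
        // CRUNCHIFY_INT = DEBUG_INT - 10, i.e. between TRACE and DEBUG, so
        // CRUNCHIFY events pass a TRACE threshold but not a DEBUG one.
        System.out.println(CrunchifyLog4jLevel.CRUNCHIFY.isGreaterOrEqual(Level.DEBUG)); // false
        System.out.println(CrunchifyLog4jLevel.CRUNCHIFY.isGreaterOrEqual(Level.TRACE)); // true
    }
}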

My log4j.xml is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/"
debug="false">

<!-- FILE Appender -->
<appender name="FILE" class="org.apache.log4j.FileAppender">
    <param name="File" value="c:/crunchify.log" />
    <param name="Append" value="false" />
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%t %-5p %c - %m%n" />
    </layout>
</appender>

<!-- CONSOLE Appender -->
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%d{ISO8601} %-5p [%c{1}] %m%n" />
    </layout>
</appender>

<!-- Limit Category and Specify Priority -->
<category name="kafkaExample">
<priority value="CRUNCHIFY" class="kafkaExample.CrunchifyLog4jLevel" />
    <appender-ref ref="CONSOLE" />
</category>

<!-- Setup the Root category -->
<root>
    <appender-ref ref="CONSOLE" />
</root>
</log4j:configuration>

I use the custom log level in my Java code like this:

logger.log(CrunchifyLog4jLevel.CRUNCHIFY, "Test message");

I am building a Spark application in which these custom logs need to be printed. When I run the application as a spark-submit job on the server, neither approach works, not even with the master set to local. The full driver program is below:

// Imports below are my best guess at what this class needs; note that as
// posted, the code mixes the log4j 1.x API (CrunchifyLog4jLevel, used in the
// active logging call) with the log4j 2.x Level.forName() call; the two Level
// classes live in different packages (org.apache.log4j vs. org.apache.logging.log4j).
package kafkaExample;

import java.util.ArrayList;
import java.util.List;

import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.logging.log4j.Level;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.expressions.UserDefinedFunction;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

import static org.apache.spark.sql.functions.callUDF;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.udf;

public class AccumulatorDriver {

    private static Logger logger = LogManager.getLogger("CRUNCHIFY");

    static final Level CUSTOM = Level.forName("CUSTOM", 350);

    public static void main(String[] args) {

//      SparkSession spark = SparkSession.builder().appName("documentation")
//              .master("spark://ch3dr609552.express-scripts.com:7077").getOrCreate();

        SparkSession spark = SparkSession.builder().appName("documentation")
                .master("local").getOrCreate();

        StringAccumulator heightValues = new StringAccumulator();
        spark.sparkContext().register(heightValues);

        logger.info("Inside driver");

        UserDefinedFunction udf1 = udf(new AccumulatorUDF(heightValues), DataTypes.StringType);
        spark.sqlContext().udf().register("AccumulatorUDF", udf1);

        UserDefinedFunction udf2 = udf(new AccumulatorUDF2(heightValues), DataTypes.StringType);
        spark.sqlContext().udf().register("AccumulatorUDF2", udf2);

        List<Row> list = new ArrayList<Row>();
        list.add(RowFactory.create("one"));
        list.add(RowFactory.create("two"));
        list.add(RowFactory.create("three"));
        list.add(RowFactory.create("four"));
        List<org.apache.spark.sql.types.StructField> listOfStructField = new ArrayList<org.apache.spark.sql.types.StructField>();
        listOfStructField.add(DataTypes.createStructField("test", DataTypes.StringType, true));
        StructType structType = DataTypes.createStructType(listOfStructField);
        Dataset<Row> data = spark.createDataFrame(list, structType);
        data.show();

        data = data.withColumn("Test2", callUDF("AccumulatorUDF", col("test")));
        data.show();
        System.out.println("Heightvalues value: " + heightValues.value());

        data = data.withColumn("Test3", callUDF("AccumulatorUDF2", col("test")));
        System.out.println("Heightvalues value: " + heightValues.value());

//      data.show();

        logger.log(CrunchifyLog4jLevel.CRUNCHIFY, "Test message");
//      logger.log(CUSTOM, "Heightvalues value: " + heightValues.value());
        List<String> values = heightValues.value();

        System.out.println("Size of list: " + values.size());
    }
}

However, the second approach does work when I run from Eclipse. The only change I make is to the following line:

private static Logger logger = LogManager.getLogger(AccumulatorDriver.class);
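
My understanding of why that change matters: log4j resolves a logger's configuration by its dotted name, matching categories by prefix, so the two getLogger calls land under different categories. Illustrated with the names as comments:

// Named logger: its name is the literal string "CRUNCHIFY", which does not
// fall under the "kafkaExample" category configured in log4j.xml.
Logger byName = LogManager.getLogger("CRUNCHIFY");

// Class-based logger: its name is "kafkaExample.AccumulatorDriver", which the
// "kafkaExample" category matches by package prefix.
Logger byClass = LogManager.getLogger(AccumulatorDriver.class);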

Do I have to change something in my Spark installation's log4j.properties file to make the logs show up on the console? I followed this question and changed log4j.properties accordingly. This is my log4j.properties file in Spark:

log4j.rootLogger=INFO, Console_Appender, File_Appender

log4j.appender.Console_Appender=org.apache.log4j.ConsoleAppender
log4j.appender.Console_Appender.Threshold=INFO
log4j.appender.Console_Appender.Target=System.out
log4j.appender.Console_Appender.layout=org.apache.log4j.PatternLayout
log4j.appender.Console_Appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

log4j.appender.File_Appender=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.File_Appender.Threshold=INFO
log4j.appender.File_Appender.File=file:///opt/spark_log/spark_log.txt
log4j.appender.File_Appender.RollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.File_Appender.TriggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.File_Appender.RollingPolicy.FileNamePattern=/opt/spark_log/spark_log.%d{MM-dd-yyyy}.%i.txt.gz
log4j.appender.File_Appender.RollingPolicy.ActiveFileName=/opt/spark_log/spark_log.txt
log4j.appender.File_Appender.TriggeringPolicy.MaxFileSize=1000000
log4j.appender.File_Appender.layout=org.apache.log4j.PatternLayout
log4j.appender.File_Appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c - %m%n

log4j.logger.myLogger=INFO,File_Appender

log4j.category.kafkaExample=INFO,kafkaExample.CrunchifyLog4jLevel
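
While reading about log4j 1.x I also came across a value#classname syntax for referencing a custom level in a properties file; if I have understood it correctly, the category line could alternatively be written like this (CRUNCHIFY and the class name come from my code above):

log4j.logger.kafkaExample=CRUNCHIFY#kafkaExample.CrunchifyLog4jLevel, Console_Appender

I am not sure whether that is what is needed here, though.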

Please help me get the custom log level output to show up even when the Spark job is run through spark-submit. Is it because I gave the custom log level the number 350, which is just below INFO's 400? I also tried 550 and got the same result.
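
For completeness, this is roughly how I would point spark-submit at a custom log4j.properties for the driver (paths and the jar name are placeholders, not my actual ones):

spark-submit \
  --class kafkaExample.AccumulatorDriver \
  --master local \
  --driver-java-options "-Dlog4j.configuration=file:/opt/spark/conf/log4j.properties" \
  my-application.jar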

No answers yet.