Hi, I have set up Airflow locally with docker compose on a Mac; the Airflow Docker image is apache/airflow:2.1.0. The task logs are full of asterisks, as shown below. I need help fixing this; I have searched a lot but could not find anything.
***Reading local file: /opt/airflow/logs/bi_sf_snowflake/git_clone/2021-07-09T11:41:24.189880+00:00/1.log
[2021-07-09 11:41:28,633] {logging_mixin.py:104} WARNING -***-***-***-******L***o***g***g***i***n***g******e***r***r***o***r******-***-***-***
[2021-07-09 11:41:28,634] {logging_mixin.py:104} WARNING -***T***r***a***c***e***b***a***c***k******(***m***o***s***t******r***e***c***e***n***t******c***a***l***l******l***a***s***t***)***:***
[2021-07-09 11:41:28,635] {logging_mixin.py:104} WARNING -*********F***i***l***e******"***/***u***s***r***/***l***o***c***a***l***/***l***i***b***/***p***y***t***h***o***n***3***.***6***/***l***o***g***g***i***n***g***/***_***_***i***n***i***t***_***_***.***p***y***"***,******l***i***n***e******9***9***4***,******i***n******e***m***i***t***
The docker compose file starts like this:
version: '3'
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-mc-airflow:Dockerfile}
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
  volumes:
    - ./dags/:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy
  command: bash -c pip
The Airflow config has the following logging settings:
logging_level = INFO
# Logging level for Flask-appbuilder UI.
#
# Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, ``DEBUG``.
fab_logging_level = WARN
# Logging class
# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# Example: logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
logging_config_class =
# Flag to enable/disable Colored logs in Console
# Colour the logs when the controlling terminal is a TTY.
colored_console_log = False
# Log format for when Colored logs is enabled
colored_log_format = [%%(blue)s%%(asctime)s%%(reset)s] {%%(blue)s%%(filename)s:%%(reset)s%%(lineno)d} %%(log_color)s%%(levelname)s%%(reset)s - %%(log_color)s%%(message)s%%(reset)s
colored_formatter_class = airflow.utils.log.colored_log.CustomTTYColoredFormatter
# Format of Log line
log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
# Specify prefix pattern like mentioned below with stream handler TaskHandlerWithCustomFormatter
# Example: task_log_prefix_template = {ti.dag_id}-{ti.task_id}-{execution_date}-{try_number}
task_log_prefix_template =
# Formatting for how airflow generates file names/paths for each task run.
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
# Formatting for how airflow generates file names for log
log_processor_filename_template = {{ filename }}.log
# full path of dag_processor_manager logfile
dag_processor_manager_log_location = /opt/airflow/logs/dag_processor_manager/dag_processor_manager.log
# Name of handler to read task instance logs.
# Defaults to use ``task`` handler.
task_log_reader = task
# A comma-separated list of third-party logger names that will be configured to print messages to
# consoles.
# Example: extra_loggers = connexion,sqlalchemy
extra_loggers =
1 Answer
This issue is fixed in Airflow 2.1.1. It was caused by the secret masker (new in 2.1.0) incorrectly masking the empty string when one of your connections has an empty password, which inserts the mask between every character of every log line.
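To see why the logs come out interleaved with asterisks: masking an empty secret means replacing the empty string, which matches at every position in the line. A minimal Python sketch of the failure mode (illustrative only, this is not Airflow's actual SecretsMasker code):

# Illustration of the bug: "masking" an empty secret replaces the empty
# string, which matches between every pair of characters in the line.
line = "Logging error"
masked = line.replace("", "***")
print(masked)  # ***L***o***g***g***i***n***g*** ***e***r***r***o***r***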
The workarounds are:
Migrate to the latest released Airflow (best).
Disable secret masking (a feature introduced in Airflow 2.1.0); see the sketch after this list.
Find the connections that have an empty password and set a non-empty one (those connections typically do not use the password value at all, so it can be set to an arbitrary random string); see the CLI sketch at the end.
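For the second workaround, secret masking is controlled by the [core] hide_sensitive_var_conn_fields option. A minimal sketch of what to add to the &airflow-common-env block in the compose file above; be aware that this also stops real credentials from being masked in task logs:

# Add to the &airflow-common-env block above. Disables the 2.1.0 secret
# masker entirely, so real passwords will appear unmasked in task logs.
AIRFLOW__CORE__HIDE_SENSITIVE_VAR_CONN_FIELDS: 'false'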
The issue is tracked here: https://github.com/apache/airflow/issues/16007
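For the third workaround, the airflow connections CLI inside one of the containers can be used to find and recreate the offending connections. A sketch, assuming a compose service named airflow-webserver and a connection id my_conn (both are placeholders for your setup):

# List the defined connections and look for ones created with an empty password.
docker compose exec airflow-webserver airflow connections list

# Recreate the offending connection with a dummy, non-empty password
# (these connections usually ignore the password, so any random value works).
docker compose exec airflow-webserver airflow connections delete my_conn
docker compose exec airflow-webserver airflow connections add my_conn \
    --conn-uri 'http://user:any-random-string@host:1234'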