Problem sending Python logs to Logstash (ELK)

krugob8w · posted 2023-08-01 in Logstash

I'm having trouble sending logs from Python to Logstash (ELK stack).
I have a Docker Compose setup for ELK like this:

version: '3.2'

services:
...

  logstash:
    restart: always
    container_name: logstash
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5010:5000"
      - "5001:5001"
      - "5002:5002"
      - "5003:5003"
      - "9600:9600"
      - "5959:5959"
      - "12201:12201/udp"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - backendnetwork
    depends_on:
      - elasticsearch

  kibana:
...

  apm-server:
....

networks:
  backendnetwork:
    driver: bridge

and a Logstash configuration:

input {
        tcp {
                port => 5000
        }
        gelf {
                port => 12201
                type => gelf
        }
        tcp {
                port => 5001
                type => "python"
                codec => "json_lines"
        }
        tcp {
                port => 5002
                type => "python2"
                codec => "json_lines"
        }
        tcp {
                port => 5003
                type => "python3"
                codec => "json_lines"
        }
        tcp {
                port => 5959
                type => "python_logs"
                codec => json
        }

}

## Add your filters / logstash plugins configuration here

output {
    if [type] == "nginx" {
        ...
    }
    else if [type] == "gelf" {
        ...
    }

    else if [type] == "python" {
       ...
    }
    else if [type] == "python_logs" {
        elasticsearch {
                hosts => "elasticsearch:9200"
                user => "elastic"
                password => "password"
                index    => "python_logs-%{+YYYY.MM.dd}"
        }
    }
    else if [type] == "python2" {
        ...
    }
    else if [type] == "python3" {
        ...
    }

    else {
        elasticsearch {
                hosts => "elasticsearch:9200"
                user => "elastic"
                password => "wnl8bFxsJZpgRsCfyejJPGJK"
                index    => "other-%{+YYYY.MM.dd}"
        }
    }
}


I expose port 5959 with the json codec to receive logs from Python. In the main docker-compose it is configured to publish port 5959 from the container.
When I send an HTTP request to Logstash with Postman:
Postman screenshot
then I can see in the Logstash logs that it is receiving a message:

at [Source: (String)"PUT /tweeter/tweet HTTP/1.1
User-Agent: PostmanRuntime/7.29.0
Accept: */*
Postman-Token: 9520b257-94e4-437d-a57e-02bfc9394180
Host: 127.0.0.1:5959
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
Content-Type: application/x-www-form-urlencoded
Content-Length: 13

message=Hello"; line: 1, column: 4]>, :data=>"PUT /tweeter/tweet HTTP/1.1\r\nUser-Agent: PostmanRuntime/7.29.0\r\nAccept: */*\r\nPostman-Token: 9520b257-94e4-437d-a57e-02bfc9394180\r\nHost: 127.0.0.1:5959\r\nAccept-Encoding: gzip, deflate, br\r\nConnection: keep-alive\r\nContent-Type: application/x-www-form-urlencoded\r\nContent-Length: 13\r\n\r\nmessage=Hello"}
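
As a side note, the json codec on the tcp input expects one newline-terminated JSON document per event, so a raw TCP test like the sketch below is closer to what the input wants than a full HTTP request. This is only a minimal sketch, assuming it runs on the Docker host where port 5959 is published:

import json
import socket

# One newline-terminated JSON document, as expected by the tcp input with codec => json
event = {"message": "Hello from a raw TCP test"}

with socket.create_connection(("localhost", 5959)) as sock:
    sock.sendall((json.dumps(event) + "\n").encode("utf-8"))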


Switching over to Python, I have this test code:

import logging

import logstash

# python-logstash TCP handler pointed at the published Logstash port
host = 'localhost'
test_logger = logging.getLogger('python-logstash-logger')
test_logger.setLevel(logging.INFO)
test_logger.addHandler(logstash.TCPLogstashHandler(host, 5959, version=1))

def run_app():
    test_logger.info('INFO: Test Log')
    test_logger.debug('DEBUG: Test Log')  # dropped by the INFO level
    test_logger.error('ERROR: Test Log')

if __name__ == '__main__':
    run_app()


When I run the test code, I can see in the debugger that the logger is configured and the handler is attached to it: Intellij Idea Debug screenshot
However, after running the application I don't see any logs being pushed to Logstash. The only entry in the index is the one I pushed via Postman: Kibana view of index logs
Can you help me understand why the Python logs are not reaching Logstash?


owfi6suc1#

It isn't clear from your docker-compose configuration whether the logstash service and your Python application share the same network, but you most likely need to change the host in your Python code:

host = 'logstash'
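
For completeness, a minimal sketch of the adjusted test setup, assuming your Python app runs as a container attached to the same backendnetwork as logstash (if it instead runs directly on the Docker host, 'localhost' together with the published port 5959 should already be reachable):

import logging

import logstash

# 'logstash' only resolves inside the compose network; 5959 is the port the tcp input listens on
host = 'logstash'
test_logger = logging.getLogger('python-logstash-logger')
test_logger.setLevel(logging.INFO)
test_logger.addHandler(logstash.TCPLogstashHandler(host, 5959, version=1))
test_logger.info('INFO: Test Log via service name')

Also keep in mind that python-logstash's TCPLogstashHandler is built on logging.handlers.SocketHandler, which typically drops records silently when it cannot connect, so a wrong host name will not raise any visible error.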

