Adding a custom log handler to Celery logging in Django


I have added a custom log handler to my Django application that writes log entries to the database.

import logging

from my_app.models import MyModel  # import path assumed; the model that stores the log rows


class DbLogHandler(logging.Handler):  # Inherit from logging.Handler
    def __init__(self):
        # run the regular Handler __init__
        logging.Handler.__init__(self)
        self.entries = []
        logging.debug("*****************[DB] INIT db handler")

    def emit(self, record):
        # build a model instance for this record; it is only buffered here
        # and written to the database in flush()
        logging.debug("*****************[DB] called emit on db handler")
        try:
            revision_instance = getattr(record, 'revision', None)
            # records without a revision attached are dropped
            if revision_instance is None:
                return
            log_entry = MyModel(name=record.name,
                                log_level_name=record.levelname,
                                message=record.msg,
                                module=record.module,
                                func_name=record.funcName,
                                line_no=record.lineno,
                                exception=record.exc_text,
                                revision=revision_instance)
            self.entries.append(log_entry)
        except Exception as ex:
            print(ex)

    def flush(self):
        # write all buffered entries to the database in a single query
        if self.entries:
            MyModel.objects.bulk_create(self.entries)
            logging.info("[+] Successfully flushed {0:d} log entries to "
                         "the DB".format(len(self.entries)))
        else:
            logging.info("[*] No log entries for DB logger")

When I call a function directly, for example by running a management command, the handler is used correctly. In production, however, the entry point will be a Celery task, and my understanding is that Celery brings its own logging machinery. What I have tried to do, but cannot get to work, is adding my DB handler to Celery's logging, i.e. so that everything Celery logs is also sent to DbLogHandler.
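
For context, Celery itself exposes logging signals precisely for attaching extra handlers; a minimal sketch of that mechanism, assuming the DbLogHandler above is importable from my_app.log_handlers, would be:

# Sketch of the signal-based hookup Celery provides (Celery 3.1);
# the module must be imported at worker startup, e.g. from celery.py.
from celery.signals import after_setup_logger, after_setup_task_logger

from my_app.log_handlers import DbLogHandler  # import path assumed


@after_setup_logger.connect
def attach_db_handler(logger, **kwargs):
    # runs once the worker's global logger has been configured
    logger.addHandler(DbLogHandler())


@after_setup_task_logger.connect
def attach_db_handler_to_task_loggers(logger, **kwargs):
    # runs for every task logger Celery sets up
    logger.addHandler(DbLogHandler())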

This is how I pictured it, in my_app.celery_logging.logger.

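Roughly, the idea is a small factory that wraps Celery's get_task_logger and attaches the DbLogHandler to it, along these lines (a sketch of the intent rather than the exact code; the import path of DbLogHandler is assumed):

# my_app/celery_logging/logger.py -- sketch of the intent, not the exact code
from celery.utils.log import get_task_logger

from my_app.log_handlers import DbLogHandler  # import path assumed


def task_logger(name):
    logger = get_task_logger(name)
    # avoid stacking duplicate handlers if the module is imported more than once
    if not any(isinstance(h, DbLogHandler) for h in logger.handlers):
        logger.addHandler(DbLogHandler())
    return logger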

Then, finally, in my task.py:

from my_app.celery_logging.logger import task_logger
logger = task_logger(__name__)
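
Note that, given the emit() above, only records carrying a revision attribute are kept; inside a task that attribute would be attached through the standard logging extra mechanism, e.g.:

# inside a task body; revision_instance is whatever model instance the
# handler expects to find on record.revision
logger.info("processing data", extra={'revision': revision_instance})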

But from this point on it is a world of pain; I cannot even describe exactly what happens. When I start the server and look at the Celery log output, I can see that my DB logger is actually being called, but Celery seems to lose its workers:

[2015-09-18 10:30:57,158: INFO/MainProcess] [*] No log entries for DB logger
Raven is not configured (logging is disabled). Please see the documentation for more information.
2015-09-18 10:30:58,659 raven.contrib.django.client.DjangoClient INFO Raven is not configured (logging is disabled). Please see the documentation for more information.
[2015-09-18 10:30:59,155: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2015-09-18 10:30:59,157: DEBUG/MainProcess] | Worker: Building graph...
[2015-09-18 10:30:59,158: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Queues (intra), Pool, Autoscaler, Autoreloader, StateDB, Beat, Consumer}
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Building graph...
[2015-09-18 10:30:59,164: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Tasks, Control, Gossip, Agent, Heart, event loop}
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Hub
[2015-09-18 10:30:59,167: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Pool
[2015-09-18 10:30:59,173: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,173: DEBUG/MainProcess] | Worker: Starting Consumer
[2015-09-18 10:30:59,174: DEBUG/MainProcess] | Consumer: Starting Connection
[2015-09-18 10:30:59,180: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2015-09-18 10:30:59,180: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,180: DEBUG/MainProcess] | Consumer: Starting Events
[2015-09-18 10:30:59,188: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,188: DEBUG/MainProcess] | Consumer: Starting Mingle
[2015-09-18 10:30:59,188: INFO/MainProcess] mingle: searching for neighbors
[2015-09-18 10:31:00,196: INFO/MainProcess] mingle: all alone
[2015-09-18 10:31:00,196: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,197: DEBUG/MainProcess] | Consumer: Starting Tasks
[2015-09-18 10:31:00,203: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,204: DEBUG/MainProcess] | Consumer: Starting Control
[2015-09-18 10:31:00,207: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,208: DEBUG/MainProcess] | Consumer: Starting Gossip
[2015-09-18 10:31:00,211: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,211: DEBUG/MainProcess] | Consumer: Starting Heart
[2015-09-18 10:31:00,212: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,212: DEBUG/MainProcess] | Consumer: Starting event loop
[2015-09-18 10:31:00,213: WARNING/MainProcess] celery@vagrant-base-precise-amd64 ready.
[2015-09-18 10:31:00,213: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2015-09-18 10:31:00,255: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/__init__.py", line 206, in start
    self.blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 374, in start
    return self.obj.start()
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 278, in start
    blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 821, in start
    c.loop(*c.loop_args())
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/loops.py", line 48, in asynloop
    raise WorkerLostError('Could not start worker processes')

And once a Celery task is invoked, I no longer see any logs at all.

