The handlers from the logging package (RotatingFileHandler and TimedRotatingFileHandler) are not working as expected.
Problem: the handler creates more files than the user-defined backupCount and does not honor the maxBytes limit.
Even though maxBytes is set, the resulting files vary wildly in size (one file may be 10 KB, another 34 B, and so on).
Log records also end up in any of the files, seemingly at random.
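For comparison, this is the behavior I would expect from RotatingFileHandler when a single process owns the handler: at most backupCount + 1 files, each capped near maxBytes. A minimal, self-contained sketch (the file name, maxBytes and backupCount values here are made up just for illustration):

import logging
import logging.handlers
import os
import tempfile

# Single-process rotation sketch; all values are illustrative only.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "application")

demo_logger = logging.getLogger("RotationDemo")
demo_logger.setLevel(logging.DEBUG)
demo_logger.addHandler(
    logging.handlers.RotatingFileHandler(log_path, maxBytes=1000, backupCount=3))

for i in range(200):
    demo_logger.info("record %d", i)

# With a single writer, at most backupCount + 1 files should exist:
# application, application.1, application.2, application.3
print(sorted(os.listdir(log_dir)))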
File 1: adds jobs to the queue

import redis
import json
import time
from rq import Queue
from apscheduler.schedulers.background import BackgroundScheduler
from newTesting1 import executeJobQueue

conn = redis.Redis('127.0.0.1')

def addInQueue(queueName, PartnerId, CompanyName):
    # Enqueue 1000 jobs on the named RQ queue
    newQ = Queue(queueName, connection=conn)
    for i in range(1000):
        newQ.enqueue(executeJobQueue, str(json.dumps({"ObjectId": "",
            "DisplayName": "", "FirstName": "", "LastName": "",
            "UserPrincipalName": "", "IsLicensed": "",
            "Licenses": "", "City": ""})))

if __name__ == "__main__":
    scheduler = BackgroundScheduler()
    scheduler.start()
    # Run addInQueue every day at 10:12, passing the queue name as the first argument
    scheduler.add_job(addInQueue, 'cron', ["syncDirectoryQueue", None, None],
                      month='*', day='*', hour='10', minute='12')
    while True:
        time.sleep(3000)
File 2: processes jobs from the queue (the newTesting1 module imported by File 1)

from rq import Worker, Queue, Connection
import redis
import logging
import logging.handlers

conn = redis.Redis('127.0.0.1')

# Module-level logger and rotating handler, shared by every job the worker runs
logger = logging.getLogger('MyLogger')
logger.setLevel(logging.DEBUG)
handler = logging.handlers.RotatingFileHandler("/Users/SMishra/log/application",
                                               maxBytes=100000, backupCount=10)
logger.addHandler(handler)

def executeJobQueue(newEntry):
    rq = Queue('syncDirectoryQueue', connection=conn)
    # The same record is logged four times for each job
    logger.info("executing queue %s %s", newEntry, len(rq))
    logger.info("executing queue %s %s", newEntry, len(rq))
    logger.info("executing queue %s %s", newEntry, len(rq))
    logger.info("executing queue %s %s", newEntry, len(rq))

if __name__ == "__main__":
    with Connection(conn):
        worker = Worker(list(map(Queue, ['syncDirectoryQueue'])))
        worker.work()
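For reference, this is the kind of quick check I use to see what the handler actually leaves on disk, independent of the worker (a small helper sketch; the directory is the one passed to the handler above):

import os

# Sketch: list the base log file and its rotated backups with their sizes.
log_dir = "/Users/SMishra/log"
for name in sorted(os.listdir(log_dir)):
    if name.startswith("application"):
        full_path = os.path.join(log_dir, name)
        print(name, os.path.getsize(full_path), "bytes")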
Output: