Expired queries and App Engine

Posted 2024-09-25 00:34:31


In a task, I iterate over a set of items with a single query. After fetching each entity from the query, I also perform a URL request. After iterating over a large number of these items, I see the following error:

BadRequestError: The requested query has expired. Please restart it with the last cursor to read more results.

How long does a query stay valid after it is created?


3 Answers

Simply create a deferred task for each element in the sequence. There is a good article with examples of how to do this correctly: "Background work with the deferred library".
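A minimal sketch of that approach, using the ndb API and the deferred library; the Article model, its source_url property, and the process_one function are illustrative names, not from the original answer:

from google.appengine.ext import ndb
from google.appengine.ext import deferred
from google.appengine.api import urlfetch

class Article(ndb.Model):            # illustrative model, standing in for the question's entities
    source_url = ndb.StringProperty()

def process_one(key):
    # Runs later in its own task queue request, so no long-lived query is held open here.
    article = key.get()
    urlfetch.fetch(article.source_url)

def enqueue_all():
    # A keys-only query is cheap to iterate; the per-entity URL request happens in deferred tasks.
    for key in Article.query().iter(keys_only=True):
        deferred.defer(process_one, key)

Each deferred task gets its own request, so a slow URL fetch for one entity cannot cause the original query to expire.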

I wrote a simple helper to do this: you call it with a batch_size, the class of objects to query, and a callback that processes each element returned by the query.

(Note that I use djangoappengine, so the query syntax below is Django-style, but you can adapt it as needed.)

import logging

def loop_over_objects_in_batches(batch_size, object_class, callback):
    # Process every object of object_class in slices of batch_size, passing each one to callback.
    logging.info("Calling batched loop with batch_size: %d, object_class: %s, callback: %s" % (batch_size, object_class, callback))

    num_els = object_class.objects.all().count()
    num_loops = num_els // batch_size  # number of full batches (integer division)
    remainder = num_els - num_loops * batch_size
    offset = 0
    while offset < num_loops * batch_size:
        logging.info("Processing batch (%d:%d)" % (offset, offset + batch_size))
        # Each slice issues a fresh, short-lived query, so no single query stays open long enough to expire.
        query = object_class.objects.all()[offset:offset + batch_size]
        for q in query:
            callback(q)

        offset = offset + batch_size

    if remainder:
        logging.info("Processing remainder batch (%d:-)" % offset)
        query = object_class.objects.all()[offset:]
        for q in query:
            callback(q)
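For example, with a hypothetical djangoappengine model Article and a callback that performs the URL request described in the question (both names are illustrative), the helper could be called like this:

from django.db import models
from google.appengine.api import urlfetch

class Article(models.Model):         # illustrative djangoappengine model
    source_url = models.URLField()

def fetch_article_url(article):
    # Hypothetical per-entity work: the URL request mentioned in the question.
    urlfetch.fetch(article.source_url)

loop_over_objects_in_batches(100, Article, fetch_article_url)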

This issue may shed some light on your problem: https://code.google.com/p/googleappengine/issues/detail?id=4432

Even though offline requests can currently live up to 10 minutes (and background instances can live forever) datastore queries can still only live for 30 seconds. We plan to improve this, but since a 'consistent' view of the data is only preserved for a limit period of time, there is an upper bound to how long a query can last (which is < 10 minutes).

...

Instead of running a single long query, consider fetching batches from the query using query cursors.
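A rough sketch of that cursor-based batching, using the ndb API and the deferred library; the Article model, the batch size of 100, and process_entity are assumptions, not part of the quoted issue:

from google.appengine.ext import ndb
from google.appengine.ext import deferred
from google.appengine.datastore.datastore_query import Cursor

class Article(ndb.Model):                 # illustrative model
    source_url = ndb.StringProperty()

def process_entity(entity):
    pass                                  # hypothetical per-entity work, e.g. the URL request

def process_batch(cursor_urlsafe=None):
    # Each task re-issues the query from the saved cursor, so every individual
    # query finishes well within the 30-second lifetime mentioned above.
    cursor = Cursor(urlsafe=cursor_urlsafe) if cursor_urlsafe else None
    entities, next_cursor, more = Article.query().fetch_page(100, start_cursor=cursor)
    for entity in entities:
        process_entity(entity)
    if more and next_cursor:
        deferred.defer(process_batch, cursor_urlsafe=next_cursor.urlsafe())

Passing the cursor as its urlsafe string keeps the deferred task payload small and easy to serialize.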
