Confused by Scrapy source code in shell.py

Asked: 2016-11-01 03:25:17

Tags: python scrapy

I have been learning Python for about a month, and last week I started reading the Scrapy source code. I can follow modules like the spider or the crawler, but I got completely confused when I ran into something curious in shell.py. Here is the code:

from twisted.internet import defer  # shell.py imports this at module level

def _request_deferred(request):
    """Wrap a request inside a Deferred.

    This function is harmful, do not use it until you know what you are doing.

    This returns a Deferred whose first pair of callbacks are the request
    callback and errback. The Deferred also triggers when the request
    callback/errback is executed (ie. when the request is downloaded)

    WARNING: Do not call request.replace() until after the deferred is called.
    """
    # zeal4u: what is the meaning of the code below?
    request_callback = request.callback
    request_errback = request.errback

    def _restore_callbacks(result):
        # put the request's original callbacks back
        request.callback = request_callback
        request.errback = request_errback
        return result

    d = defer.Deferred()
    d.addBoth(_restore_callbacks)
    if request.callback:
        d.addCallbacks(request.callback, request.errback)

    # swap the request's callbacks for the deferred's
    request.callback, request.errback = d.callback, d.errback
    return d

So the key question is: why does it assign 'request.callback' and 'request.errback' to the local variables 'request_callback' and 'request_errback', and then assign those variables back in the nested function '_restore_callbacks'?

I know this can't be pointless, but what does it really accomplish, and how does it work? Or should I read some related modules to figure it out? Please give me some advice. :)
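To check my reading of the mechanics, I put together a small standalone sketch (FakeRequest and parse are my own stand-ins here, not Scrapy classes) that imitates the swap outside of a crawler:

from twisted.internet import defer

class FakeRequest:
    """Stand-in for scrapy.Request: it only holds callback/errback slots."""
    def __init__(self, callback=None, errback=None):
        self.callback = callback
        self.errback = errback

def parse(response):
    # plays the role of the spider's original callback
    print("original callback got:", response)
    return response

request = FakeRequest(callback=parse)

# --- the same steps as _request_deferred ---
saved_callback, saved_errback = request.callback, request.errback

def _restore_callbacks(result):
    request.callback, request.errback = saved_callback, saved_errback
    return result

d = defer.Deferred()
d.addBoth(_restore_callbacks)                          # first: undo the swap
if request.callback:
    d.addCallbacks(request.callback, request.errback)  # then: run the original
request.callback, request.errback = d.callback, d.errback
# -------------------------------------------

# The "downloader" knows nothing about the Deferred; it just fires the
# request's callback, which has temporarily become d.callback:
request.callback("fake response")

# By the time the callback chain has run, the originals are back:
assert request.callback is parse

If I read it correctly, the local variables exist because 'request.callback' is about to be overwritten with 'd.callback', so the originals have to be saved somewhere and put back once the Deferred fires. But I'm not sure why restoring them afterwards matters, so corrections are welcome.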

0 Answers
