Limiting the number of concurrent requests with aiohttp

Asked: 2018-05-05 14:24:23

Tags: python image python-requests python-asyncio aiohttp

I am downloading images with aiohttp and would like to know whether there is a way to limit the number of open requests that haven't finished yet. This is the code I currently have:

async def get_images(url, session):

    chunk_size = 100

    # Print statement to show when a request is being made. 
    print(f'Making request to {url}')

    async with session.get(url=url) as r:
        with open('path/name.png', 'wb') as file:
            while True:
                chunk = await r.content.read(chunk_size)
                if not chunk:
                    break
                file.write(chunk)

# List of urls to get images from
urls = [...]

conn = aiohttp.TCPConnector(limit=3)
loop = asyncio.get_event_loop()
session = aiohttp.ClientSession(connector=conn, loop=loop)
loop.run_until_complete(asyncio.gather(*(get_images(url, session=session) for url in urls)))

The problem is that I added a print statement for each request, and it tells me that nearly 21 requests are being made at once, rather than the 3 I want to limit it to (that is, once one image finishes downloading, the next URL in the list can be fetched). I'm just wondering what I'm doing wrong here.

2 Answers:

Answer 0 (Score: 5)

Your limit is set up correctly; the mistake is in how you are debugging it.

As Mikhail Gerasimov pointed out in the comment, you put the print() call in the wrong place: it must be inside the session.get() context.

To make sure the limit is respected, I tested your code against a simple logging server, and the test shows that the server receives exactly the number of connections you set in TCPConnector. Here is the test:

import asyncio
import aiohttp
loop = asyncio.get_event_loop()


class SilentServer(asyncio.Protocol):
    def connection_made(self, transport):
        # We will know when the connection is actually made:
        print('SERVER |', transport.get_extra_info('peername'))


async def get_images(url, session):

    chunk_size = 100

    # This log doesn't guarantee that we will connect,
    # session.get() will freeze if you reach TCPConnector limit
    print(f'CLIENT | Making request to {url}')

    async with session.get(url=url) as r:
        while True:
            chunk = await r.content.read(chunk_size)
            if not chunk:
                break

urls = [f'http://127.0.0.1:1337/{x}' for x in range(20)]

conn = aiohttp.TCPConnector(limit=3)
session = aiohttp.ClientSession(connector=conn, loop=loop)


async def test():
    await loop.create_server(SilentServer, '127.0.0.1', 1337)
    await asyncio.gather(*(get_images(url, session=session) for url in urls))

loop.run_until_complete(test())
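
To see the same thing from the client side, here is a minimal sketch of the question's get_images() (keeping its placeholder path 'path/name.png') with the print() moved inside the session.get() block, so a line is only logged once a connection slot has actually been acquired:

async def get_images(url, session):

    chunk_size = 100

    async with session.get(url=url) as r:
        # This line now appears only after a connection slot was acquired,
        # so at most `limit` requests are logged as in progress at once.
        print(f'Making request to {url}')
        with open('path/name.png', 'wb') as file:
            while True:
                chunk = await r.content.read(chunk_size)
                if not chunk:
                    break
                file.write(chunk)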

Answer 1 (Score: 1)

asyncio.Semaphore solves exactly this problem.

In your case it would look something like this:

semaphore = asyncio.Semaphore(3)


async def get_images(url, session):

    async with semaphore:

        print(f'Making request to {url}')

        # ...

You may also be interested in this ready-to-run code example, which demonstrates how a semaphore works.
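
For reference, here is a self-contained sketch along the same lines (the localhost URLs are placeholders, as in the test server above) that caps the number of downloads in flight at 3 with asyncio.Semaphore:

import asyncio
import aiohttp

# Placeholder URLs, mirroring the test server example in the first answer.
urls = [f'http://127.0.0.1:1337/{x}' for x in range(20)]


async def get_image(url, session, semaphore):
    # At most 3 coroutines can be inside this block at the same time;
    # the others wait here until a slot is released.
    async with semaphore:
        print(f'Making request to {url}')
        async with session.get(url) as r:
            await r.read()


async def main():
    # Create the semaphore inside the running event loop.
    semaphore = asyncio.Semaphore(3)
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(get_image(url, session, semaphore) for url in urls))


loop = asyncio.get_event_loop()
loop.run_until_complete(main())

Unlike TCPConnector(limit=3), the semaphore holds the extra coroutines back before they even call session.get(), so the client-side print output itself shows at most 3 requests in progress.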