Making Concurrent Requests with aiohttp in Python

Mar 3, 2024 · 2 min read

When building applications with aiohttp in Python, it's common to need to make multiple requests concurrently rather than sequentially. There are a few ways to achieve this while avoiding some common pitfalls.

Use asyncio.gather

The easiest way is with asyncio.gather, which lets you kick off multiple coroutines concurrently and wait for them all to complete:

import asyncio
import aiohttp

async def fetch(url):
    # Note: this opens a new ClientSession per request; the next section shows a better pattern
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

urls = ['https://example.com/1', 'https://example.com/2']

async def main():
    results = await asyncio.gather(*[fetch(url) for url in urls])
    print(results)

asyncio.run(main())

This fires off all the fetch coroutines at once and collects the responses once they've all completed.
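
By default, gather raises the first exception from any fetch, and you lose the other results at the call site. If you'd rather collect per-URL errors, gather accepts return_exceptions=True; a variant of main() might look like this:

async def main():
    results = await asyncio.gather(
        *[fetch(url) for url in urls],
        return_exceptions=True,  # failures come back as exception objects
    )
    for url, result in zip(urls, results):
        if isinstance(result, Exception):
            print(f'{url} failed: {result!r}')
        else:
            print(f'{url} returned {len(result)} characters')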

Reuse a single session

When making multiple requests, it's better to reuse a single ClientSession instance rather than creating a new one per request. This lets aiohttp pool connections and keep them alive under the hood for better performance:

async with aiohttp.ClientSession() as session:
    async with session.get(url1) as response1:
        data1 = await response1.text()
    async with session.get(url2) as response2:
        data2 = await response2.text()
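
These two requests share a session but still run one after the other. To combine session reuse with the concurrency of asyncio.gather, one small refactor of the earlier example is to pass the shared session into fetch:

import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

urls = ['https://example.com/1', 'https://example.com/2']

async def main():
    # One session for every request, so connections are pooled and kept alive
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*[fetch(session, url) for url in urls])
    print(results)

asyncio.run(main())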

Limit concurrency

If you have a large number of URLs, don't try to kick them all off at once! Servers often throttle or reject clients that open too many connections, so you could hit errors. Use asyncio.Semaphore to limit concurrency:

semaphore = asyncio.Semaphore(10)

async def fetch(session, url):
    async with semaphore:
        # Only 10 coroutines can be past this point at once
        async with session.get(url) as response:
            return await response.text()

This allows at most 10 requests in flight at a time. Tune the limit based on what the target sites can handle.
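
Putting it all together, here is one way to wire up the semaphore version of fetch from above (the 100-URL list is just a stand-in; on Python 3.10+ the module-level semaphore binds to the loop lazily, so it works with asyncio.run):

async def main(urls):
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*[fetch(session, url) for url in urls])

urls = [f'https://example.com/{i}' for i in range(100)]  # stand-in URL list
results = asyncio.run(main(urls))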

In summary, aiohttp and asyncio provide great tools for concurrent requests, but take care to reuse sessions and limit concurrency. Happy fetching!
