I have this task:
I need to make requests to several different URLs on a server in parallel.
That is:

Thread 1:
while 1:
    resp=requests.get("http://site.ru/path1")


Thread 2:
while 1:
    resp=requests.get("http://site.ru/path2")


....

There are about 25 different links. What is the best way to implement this? asyncio?

3 Answers

import asyncio
import requests
from mysql import MySql
import concurrent.futures


def get_data(id):
    mysql = MySql()

    while True:
        # id is an int, so convert it before concatenating into the URL
        resp = requests.get("http://site.ru/path?id=" + str(id))
        mysql.insert_data(resp)

    return 1  # unreachable: the loop above never exits


async def main():
    with concurrent.futures.ThreadPoolExecutor(max_workers=20) as executor:
        ids = [1, 2, 3, 4, 5]  # ..., N

        loop = asyncio.get_event_loop()
        futures = [
            loop.run_in_executor(executor, get_data, id)
            for id in ids
        ]
        for response in await asyncio.gather(*futures):
            pass


loop = asyncio.get_event_loop()
loop.run_until_complete(main())
  • requests does not work through asyncio, so the example is meaningless; you need to drop either requests or asyncio – Magnetic59 Jan 26 '17 at 14:22
  • Magnetic59: requests does not run asynchronously, but my MySql class does, so the example still makes sense. – Sim31 Jan 26 '17 at 23:24
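As the comment thread notes, requests blocks the event loop; a pure-asyncio variant would replace it with a non-blocking client such as aiohttp. A minimal sketch of the gather pattern, with the HTTP call stubbed out (the fetch body and URL list are placeholders, not taken from the answer):

```python
import asyncio

URLS = ["http://site.ru/path%d" % i for i in range(1, 26)]  # the ~25 links

async def fetch(url):
    # Placeholder for a real non-blocking request
    # (e.g. aiohttp.ClientSession.get); sleeps instead of
    # touching the network so the sketch runs anywhere.
    await asyncio.sleep(0)
    return "resp:" + url

async def main():
    # Schedule all coroutines concurrently and wait for them all;
    # gather returns the results in the order the coroutines were given.
    return await asyncio.gather(*(fetch(u) for u in URLS))

responses = asyncio.run(main())
```

With a real client, only the body of fetch changes; the gather scaffolding stays the same.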
For 25 links you can do it with threads; it can even be faster if the domains are different.
asyncio is for something more serious, or for when you just want to dig into asyncio.
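The thread-based approach this answer describes can be sketched with concurrent.futures alone; here the fetch body is stubbed out, on the assumption that the real version would call requests.get:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

URLS = ["http://site.ru/path%d" % i for i in range(1, 26)]  # ~25 links

def fetch(url):
    # Stand-in for requests.get(url).text so the sketch
    # runs without network access.
    return "resp:" + url

def fetch_all(urls):
    results = {}
    # One worker per link: 25 threads is cheap for I/O-bound work.
    with ThreadPoolExecutor(max_workers=len(urls)) as executor:
        future_to_url = {executor.submit(fetch, u): u for u in urls}
        for future in as_completed(future_to_url):
            results[future_to_url[future]] = future.result()
    return results

results = fetch_all(URLS)
```

as_completed yields futures in completion order, which suits a "store each response as it arrives" loop; executor.map would be the choice if the original URL order mattered.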