6

when the service endpoint only takes in 1 item to update at a time

fml

Comments
  • 1
    Me when I'm getting paid by the request:
  • 1
    These are the times when I feel like REST was naive for a methodology that claimed to be built on 10 years of experience maintaining legacy software.
  • 1
    Then again, there's probably nothing useful to be said about batching protocols in general; they're pretty much all unique to the level of regularity in, and the frequency and nature of changes to, your data.
  • 2
    @lorentz xmlrpc has supported batching since the year 2000. xmlrpc is a bit older than REST, but it has supported multicall since the year REST officially became a thing. So there were people who had already thought about this.
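    For illustration, Python's standard library still ships XML-RPC multicall. This is a minimal sketch, not anyone's production setup: it spins up a throwaway local server (the `double` function is made up) and batches three calls into a single `system.multicall` request:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy, MultiCall

# Throwaway local server on an ephemeral port, just to demo system.multicall.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_multicall_functions()  # enables the system.multicall endpoint
server.register_function(lambda x: x * 2, "double")  # hypothetical method

host, port = server.server_address
threading.Thread(target=server.serve_forever, daemon=True).start()

proxy = ServerProxy(f"http://{host}:{port}")
multi = MultiCall(proxy)      # queues calls client-side
for n in (1, 2, 3):
    multi.double(n)
results = list(multi())       # one HTTP round trip, three results
print(results)                # [2, 4, 6]
server.shutdown()
```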

    OpenAI supports batching of JSON in the JSONL format. JSONL just means one object per line, so {} {} {} (each on its own line) is valid; the objects don't have to be wrapped in a list or anything.
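    A minimal sketch of the JSONL shape described above (the item fields here are made up for illustration): each batch item is serialized as its own line, and parsing is just line-by-line `json.loads`.

```python
import json

# Hypothetical batch items, one JSON object per line (JSONL).
items = [{"id": 1, "op": "update"}, {"id": 2, "op": "update"}]

# Serialize: newline-joined objects, no surrounding list.
payload = "\n".join(json.dumps(item) for item in items)

# Parse it back: every non-empty line is a standalone JSON object.
parsed = [json.loads(line) for line in payload.splitlines() if line]
```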
  • 1
    If the endpoint only supports one request at a time:

    import asyncio
    import aiohttp

    async def main():
        async with aiohttp.ClientSession() as session:
            tasks = []
            for x in range(1337):
                tasks.append(session.get("http://your-endpoint"))
            results = await asyncio.gather(*tasks)

    asyncio.run(main())

    Takketak, Prrrrrrrrrrrrt
  • 1
    The idiot who said "One million requests with asyncio!" didn't mention that it took his computer the whole weekend to make those requests. I didn't believe it from the start; you can't even open that many file descriptors without raising some ulimit. Never fell for it.