> Non-blocking HTTP request is the bread and butter use case for asyncio
And the amount of contorting that has to be done for it in Python would be hilarious if it weren't so sad.
> Most JS projects
I don't know what JavaScript does, but I do know that Python is not JavaScript.
> You want to manage your own thread pool for this...
In Python, concurrent.futures' ThreadPoolExecutor is actually nice to use and doesn't require rewriting existing worker code. It's already done, has a clean interface, and was in the standard library before asyncio was.
ThreadPoolExecutor is the most similar thing to asyncio: it hands out futures, and calling .result() on one is much like awaiting it. JS even made its own promises implicitly compatible with async/await. I'm mentioning what JS does because you're describing a very common JS use case, and Python isn't all that different.
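A minimal sketch of that parallel (fetch is a made-up placeholder for any ordinary blocking function):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(n):
    """An ordinary blocking function; no asyncio awareness needed."""
    time.sleep(0.01)
    return n * 2

with ThreadPoolExecutor(max_workers=4) as pool:
    # submit() returns a Future immediately, like calling an async function
    futures = [pool.submit(fetch, i) for i in range(4)]
    # .result() blocks until the value is ready, much like await
    results = [f.result() for f in futures]

print(results)  # [0, 2, 4, 6]
```

Note the worker function is completely unchanged; only the caller knows a pool is involved.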
If you have async stuff happening all over the place, what do you use, a global ThreadPoolExecutor? It's not bad, but it's a bit more cumbersome and probably less efficient: you're running multiple OS threads contending on locks, versus a single-threaded event loop. It gets worse the more long-running blocking calls there are.
Also, I was originally asking about free threads. GIL isn't a problem if you're just waiting on I/O. If you want to compute on multiple cores at once, there's multiprocessing, or more likely you're using stuff like numpy that uses C threads anyway.
Again, Python's implementation of asyncio does not allow you to background worker code without explicitly altering that worker code to be aware of asyncio. Threads do. They just don't occupy the same space.
> Also, I was originally asking about free threads...there's multiprocessing
Eh, the obvious reason not to want separate processes is a desire for some kind of shared state without the cost or burden of IPC. The fact that you suggested multiprocessing.Pool instead of concurrent.futures.ProcessPoolExecutor and asked about manual pool management tells me a little bit about where your head is at here wrt Python.
Basically true in JS too. You're not supposed to do blocking calls in async code. You also can't "await" an async call inside a non-async func, though you could fire-and-forget it.
Right, but how often does a Python program have complex shared state across threads, rather than some simple fan-out-fan-in, and also need to take advantage of multiple cores?
The primary thing that tripped me up about async/await, specifically only in Python, is that the called function does not begin running until you await it. Before that moment, it's just an unstarted coroutine object (essentially a generator).
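You can see this directly with a tiny sketch (the names here are just for illustration):

```python
import asyncio

events = []

async def work():
    events.append("work ran")
    return 42

async def main():
    coro = work()                    # creates a coroutine object; nothing runs yet
    events.append("created, not started")
    return await coro                # work() only begins executing here

result = asyncio.run(main())
print(events)   # ['created, not started', 'work ran']
print(result)   # 42
```

If work() started eagerly, "work ran" would come first; it doesn't.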
To make background jobs, I've written a class whose constructor starts a thread, and whose __await__ magic method simply joins the thread. That's a lot of boilerplate to get a little closer to how async works in (at least) JS and C#.
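Roughly, that boilerplate looks like this. A sketch only, not the commenter's exact code: BackgroundJob and slow_square are made-up names, and the join() inside __await__ will block the whole event loop while it waits:

```python
import asyncio
import threading
import time

class BackgroundJob:
    """Start work on a thread immediately; make the object awaitable."""
    def __init__(self, fn, *args):
        self._result = None
        self._thread = threading.Thread(target=self._run, args=(fn,) + args)
        self._thread.start()       # unlike a coroutine, this starts right away

    def _run(self, fn, *args):
        self._result = fn(*args)

    def __await__(self):
        self._thread.join()        # caveat: blocks the event loop until done
        return self._result
        yield                      # unreachable; makes __await__ a generator

def slow_square(n):
    time.sleep(0.01)
    return n * n

async def main():
    job = BackgroundJob(slow_square, 7)   # already running in the background
    return await job                      # join the thread, get the result

print(asyncio.run(main()))  # 49
```

The dead `yield` is the ugly part: __await__ must return an iterator, so the method has to be a generator even though it never actually yields.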
Rust's version of async/await is the same in that respect, where futures don't do anything until you poll them (e.g., by awaiting them): if you want something to just start right away, you have to call out to the executor you're using, and get it to spawn a new task for it.
Though to be fair, people complain about this in Rust as well. I can't comment much on it myself, since I haven't had any need for concurrent workloads that Rayon (a basic thread-pool library with work stealing) can't handle.
That is a common split in language design decisions. I think the Python style, where you have to drive it to begin, is more useful: you can always just start it immediately, but it also lets you delay computation or pass it around, similar to a Haskell thunk.
I feel you. I know asyncio is "the future", but I usually just want to write a background task, and really hate all the gymnastics I have to do with the color of my functions.
I feel like "asyncio is the future" was invented by the same people who think it's totally normal to switch to a new javascript web framework every 6 months.
JS has had an event loop since the start. It's an old concept that Python seems to have lifted, as did Rust. I used Python for a decade and never really liked the way it did threads.
Python's reactor pattern, or event loop as you call it, started with the "Twisted" framework or library. And that was first published in 2003. That's a full 6 years before Node.js was released which I assume was the first time anything event-loopy started happening in the JS world.
I forgot to mention that it came into prominence in the Python world through the Tornado HTTP server library, which did the same thing. Slowly, over time, more and more language features were added to give native, first-class-citizen support to what a lot of people were already doing behind the scenes (sometimes via very contrived abuses of generator functions).