this post was submitted on 31 Dec 2025
19 points (88.0% liked)

Python
top 13 comments
[–] rimu@piefed.social 4 points 1 month ago (3 children)

Wow, very interesting to see Sync doing so well in comparison!

But I thought a lot of the advantage of async is its ability to handle many concurrent requests? What happens if you run the same tests, but this time with 100 requests in parallel?

[–] solrize@lemmy.ml 2 points 1 month ago (2 children)

I've run Python programs with 1000+ threads and it's fine. 10k might be harder. Async is overrated; Erlang and Elixir are preferable.
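For a rough sense of what "1000+ threads" looks like in practice, here's a minimal stdlib sketch; the 10 ms sleep is a hypothetical stand-in for blocking I/O:

```python
import threading
import time

def worker(results, i):
    # simulate an I/O-bound task (e.g. waiting on a socket)
    time.sleep(0.01)
    results[i] = True

# spawning ~1000 OS threads is routine on modern systems; each mostly
# costs a stack allocation and uses little CPU while blocked on I/O
N = 1000
results = [False] * N
threads = [threading.Thread(target=worker, args=(results, i)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sum(results))  # → 1000
```

The GIL is not the bottleneck here, since the threads spend their time blocked rather than computing.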

[–] logging_strict@programming.dev 1 points 1 month ago

Overrated or not, the choice is between sync and async drivers. Actually, there is no choice, just an illusion of choice.

So async or async ... choose. Without the web router running multithreaded, concurrency will have minimal effect. But everything is step by step; free-threaded Python hasn't been out for very long.

[–] hackeryarn@lemmy.world 0 points 1 month ago* (last edited 1 month ago)

Only reliable web server is an Erlang web server.

[–] hackeryarn@lemmy.world 1 points 1 month ago (1 children)

This is running with concurrent requests. 64 workers firing requests, to be exact.

[–] rimu@piefed.social 1 points 1 month ago

That sounds like plenty. Cool!

[–] Ephera@lemmy.ml 1 points 1 month ago (1 children)

I mean, it's a bit of a weird comparison to begin with. Web servers were always able to parallelize without async, because they'd just spawn a new thread per request. The real advantage of async is in programs where working with threads isn't as trivial...
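A toy illustration of that thread-per-request model, using only the stdlib's `ThreadingHTTPServer` (the handler and ephemeral port are made up for the example):

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import threading
import urllib.request

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # keep the example quiet
        pass

# ThreadingHTTPServer handles each incoming request on its own thread --
# concurrency with no async in sight
server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
print(resp)  # → b'ok'
server.shutdown()
```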

[–] hackeryarn@lemmy.world 2 points 1 month ago

I totally agree with you. This article was really a response to a lot of hype around async web servers in Python.

I kind of knew what to expect, but wanted to throw real numbers against it. I was surprised to see a 10x slowdown with the async switch in Django.

[–] sherbang@chaos.social 1 points 1 month ago (2 children)

@hackeryarn It's not clear from this writeup how SQLAlchemy is set up. If you're using a sync postgres driver then you're doing async-to-sync in your code and not testing what you think you're testing.

A test of different async SQLAlchemy configurations would be helpful next to this. Including testing that the SQLAlchemy setup is async all the way through.
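To illustrate why a sync driver inside async code undermines the test, here is a stdlib-only sketch (the `sync_query` stand-in is hypothetical): three "concurrent" handlers end up serialized because the blocking call never yields to the event loop.

```python
import asyncio
import time

def sync_query():
    # stand-in for a *sync* DB driver call; it blocks the whole event loop
    time.sleep(0.2)
    return "row"

async def handler():
    return sync_query()  # no await here -> the loop stalls for 0.2 s

async def main():
    start = time.perf_counter()
    await asyncio.gather(handler(), handler(), handler())
    return time.perf_counter() - start

elapsed = asyncio.run(main())
# three "concurrent" handlers serialize: ~0.6 s total, not ~0.2 s
print(elapsed > 0.5)  # → True
```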

[–] logging_strict@programming.dev 2 points 1 month ago

I live and breathe this stuff

SQLAlchemy's AsyncSession calls greenlet_spawn, which wraps a sync Session method. For an async dialect+driver, the SQLAlchemy DBAPI driver will make async connections.

Hey, let's make an async call. You mean rewrite the exact same code, except with awaits sprinkled about? Fuck that! Once is enough. Instead, wrap the sync call in greenlet_spawn. And then return to the gulag of static type checking hell forever.

So is it async all the way through? No. It's async enough™.
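A stdlib analogy for that bridging idea (this uses `asyncio.to_thread`, not SQLAlchemy's actual greenlet_spawn machinery, and the query function is made up): the sync body stays untouched, and only a thin async wrapper is added.

```python
import asyncio
import time

def legacy_sync_query(q):
    # existing sync code path -- unchanged, no awaits sprinkled in
    time.sleep(0.2)
    return f"result for {q}"

async def async_query(q):
    # bridge: run the sync body on a worker thread so the loop stays free
    return await asyncio.to_thread(legacy_sync_query, q)

async def main():
    start = time.perf_counter()
    rows = await asyncio.gather(*(async_query(i) for i in range(3)))
    return rows, time.perf_counter() - start

rows, elapsed = asyncio.run(main())
print(rows)     # → ['result for 0', 'result for 1', 'result for 2']
print(elapsed < 0.5)  # → True (the three calls overlap)
```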

[–] hackeryarn@lemmy.world 2 points 1 month ago

It is using the async driver. I am using FastAPI’s thin wrapper around SQLAlchemy which also does some slight tuning for it to work better with FastAPI in an async mode.

[–] loweffortname@lemmy.blahaj.zone 0 points 1 month ago (1 children)

It wasn't clear whether they have a connection proxy in front of the Postgres instance. PostgreSQL connections are expensive, so something like PgBouncer could also make a big difference here.

(I realize the point was to test python web servers, but it would have been an interesting additional metric.)

[–] hackeryarn@lemmy.world 2 points 1 month ago

No connection proxy in this case. The pooled sync test uses client-side pooling, which shows better performance. Using a proxy would have the same effect; it just moves the pooling to the server side.
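A toy sketch of what client-side pooling means (sqlite3 stands in for the Postgres driver here; real code would use psycopg's pool or SQLAlchemy's built-in one):

```python
import queue
import sqlite3

class ConnectionPool:
    """Minimal client-side pool: connections are opened once and reused."""

    def __init__(self, size):
        self._pool = queue.Queue()
        for _ in range(size):
            conn = sqlite3.connect(":memory:", check_same_thread=False)
            self._pool.put(conn)

    def acquire(self):
        # blocks when all connections are checked out, capping DB load
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)

pool = ConnectionPool(size=4)
conn = pool.acquire()
value = conn.execute("SELECT 1").fetchone()[0]
pool.release(conn)
print(value)  # → 1
```

A server-side proxy like PgBouncer does the same bookkeeping, but in front of the database rather than inside each client process.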