[–] digdilem@lemmy.ml 72 points 1 day ago (2 children)

Surprised at the level of negativity here. Having had my sites repeatedly DDoSed offline by ClaudeBot and others scraping the same damned thing over and over again, thousands of times a second, I welcome any measures that help.

[–] AWittyUsername@lemmy.world 37 points 1 day ago

I think the negativity is around the unfortunate fact that solutions like this shouldn't be necessary.

[–] dan@upvote.au 4 points 15 hours ago (1 children)

> thousands of times a second

Modify your Nginx (or whatever web server you use) config to rate limit requests to dynamic pages, and cache them. For Nginx, you'd use either fastcgi_cache or proxy_cache depending on how the site is configured. Even if the pages change a lot, a cache with a short TTL (say 1 minute) can still help reduce load quite a bit while not letting them get too outdated.
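As a rough sketch of what that looks like in Nginx (the zone names, paths, limits and the `127.0.0.1:8080` upstream are placeholders, not anyone's actual config):

```nginx
# Shared-memory zones (http block): a short-TTL cache keyed by URL,
# plus a per-IP request rate limit.
proxy_cache_path /var/cache/nginx/app levels=1:2 keys_zone=appcache:10m
                 max_size=1g inactive=10m;
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;
    server_name example.com;

    location / {
        # Allow short bursts, then return 429 instead of hitting the app.
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;

        proxy_pass http://127.0.0.1:8080;

        # Short-TTL cache: repeated scrapes of the same page are served
        # from cache rather than regenerating the page every time.
        proxy_cache appcache;
        proxy_cache_valid 200 301 1m;
        proxy_cache_use_stale error timeout updating;
        proxy_cache_lock on;   # collapse concurrent misses into one upstream request
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

For a FastCGI backend (e.g. PHP-FPM) the equivalent `fastcgi_cache_path` / `fastcgi_cache` directives apply instead of the `proxy_cache` ones.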

Static content (and cached content) shouldn't cause issues even if requested thousands of times per second. Following best practices like pre-compressing content using gzip, Brotli, and zstd helps a lot, too :)
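A minimal sketch of serving pre-compressed static files, assuming the assets were compressed ahead of time (e.g. `app.js.gz` sitting next to `app.js`; the paths here are illustrative):

```nginx
location /static/ {
    root /var/www/example.com;

    # ngx_http_gzip_static_module: serve app.js.gz to clients that accept gzip,
    # instead of compressing the file on every request.
    gzip_static on;

    # Brotli and zstd equivalents (brotli_static / zstd_static) require the
    # third-party ngx_brotli / zstd modules, so they're left commented out here.
    # brotli_static on;

    expires 7d;                        # static files don't change per request
    add_header Cache-Control "public";
}
```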

Of course, this advice is just for "unintentional" DDoS attacks, not intentionally malicious ones. Those are often much larger and need different mitigation - usually at the network or load balancer level, before the traffic even hits the server.

[–] digdilem@lemmy.ml 1 points 10 hours ago

Already done, along with a bunch of other stuff including Cloudflare WAF and rate-limiting rules.

I am still annoyed that it took over a day of my life to finally (so far) restrict these things, and several more days to offload the problem to Cloudflare Pages for sites I previously self-hosted but my rural link couldn't support.

> this advice is just for "unintentional" DDoS attacks, not intentionally malicious ones.

And I don't think these high-volume AI scrapes are unintentional DDoS attacks. I consider them entirely intentional. Not deliberately malicious, but negligent to the point of criminality (especially in requesting the same pages so frequently, and all of them ignoring robots.txt).
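For crawlers that ignore robots.txt, one common server-side measure is to refuse requests by user agent. The sketch below uses Nginx and a few commonly published scraper agent strings as examples; it is illustrative only (user agents can be spoofed, which is why people layer rate limiting and edge/WAF rules on top, as described above):

```nginx
# Map known scraper user agents to a flag (http block).
map $http_user_agent $blocked_scraper {
    default         0;
    ~*ClaudeBot     1;
    ~*GPTBot        1;
    ~*CCBot         1;
    ~*Bytespider    1;
}

server {
    listen 80;
    server_name example.com;

    if ($blocked_scraper) {
        return 403;    # or 429 to signal "back off" rather than "forbidden"
    }

    # ...rest of the site config...
}
```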