this post was submitted on 13 Mar 2026
307 points (98.7% liked)

Technology

[–] Upgrayedd1776@sh.itjust.works 1 points 8 hours ago (1 children)

The moderation and comment-moderation history is nice, but it's sort of hidden; it would be nicer if it were more intuitively inline. Also, did you mean "bot scrapers" rather than "not scrapers"? And I've been toying with an idea: what about AI/bot-supported pipelines? Accept that bots are part of the visitors, serve them optimized data, and then restrict UI/UX-heavy processes to stricter per-second rate limits... sort of like a robots.txt v2 or something.
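That "stricter rate limits per second" idea could be sketched as a per-route token-bucket policy that declares different rates for bots and humans. Everything here (the route names, the rates, the bot/human split) is an illustrative assumption, not an existing standard or Lemmy feature:

```python
import time
from collections import defaultdict

# Hypothetical "robots.txt v2" policy: declared bots get a decent rate on a
# machine-optimized data route, but are throttled hard on UI/UX-heavy routes.
# All routes and numbers below are made up for illustration.
POLICY = {
    "/api/export": {"bot": 5.0, "human": 1.0},  # requests/second allowed
    "/ui/search":  {"bot": 0.2, "human": 2.0},  # expensive UI route: bots throttled
}

class TokenBucket:
    """Token bucket: refills at `rate` tokens/sec, burst-capped at its capacity."""
    def __init__(self, rate):
        self.rate = rate
        self.capacity = max(rate, 1.0)  # always allow at least one initial request
        self.tokens = self.capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets = defaultdict(dict)  # client id -> route -> TokenBucket

def check(client, route, is_bot):
    """Return True if `client` may hit `route` right now under the policy."""
    rate = POLICY[route]["bot" if is_bot else "human"]
    bucket = buckets[client].setdefault(route, TokenBucket(rate))
    return bucket.allow()
```

A bot's first request to the throttled UI route is allowed, but an immediate second one is rejected because the 0.2 req/s bucket has had no time to refill.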

[–] ProdigalFrog@slrpnk.net 2 points 5 hours ago

It's less hidden for admins. Whenever we click on a user's profile, there's a button that can take us directly to their individual mod-log history, which can help us quickly see if there's a history of AI or bot spam behavior that was spotted by other admins/mods.

> Also did you mean “bot scrapers”

Agh, auto correct got me. Yes, I meant bot scrapers.

As for the robots.txt v2 idea, I'm not sure it would be very effective or popular among admins, as I think most would prefer not to give bots any data at all if possible, and even with reduced rates there would probably still be enough bots and queries to cause a significant load.

I think there have been some experiments with 'trapping' bots in a recursive loop that they don't realize is a loop, and therefore can't escape, but I'm not sure how effective those have been.
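The "recursive loop" trap described above can be sketched as a page generator whose links only ever lead deeper into the maze: link tokens are derived deterministically from the current path, so the site looks varied to a crawler but never links back out. This is a toy sketch of the general idea, not any particular project's implementation, and the function name and page format are hypothetical:

```python
import hashlib

def trap_page(path, n_links=5):
    """Generate an HTML page whose links all lead deeper into the trap.

    Each link token is a hash of the current path plus an index, so the
    output is deterministic (cheap to serve) while appearing to a crawler
    as an endless tree of fresh pages with no way back out.
    """
    links = []
    for i in range(n_links):
        token = hashlib.sha256(f"{path}/{i}".encode()).hexdigest()[:12]
        links.append(f'<a href="{path.rstrip("/")}/{token}">{token}</a>')
    body = "\n".join(links)
    return f"<html><body>\n{body}\n</body></html>"
```

Mounted behind a catch-all route, every request under the trap prefix gets a unique-looking page of five onward links, and a crawler that follows them never reaches real content.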