this post was submitted on 20 Mar 2026
-13 points (43.4% liked)

Selfhosted


I built a Python script that uses a local Ollama LLM to automatically find and add movies to Radarr.

It picks random films from your library, asks Ollama for similar suggestions based on theme and atmosphere, validates the candidates against OMDb, scores them with plot embeddings, and then adds the top results to Radarr automatically.

Examples:

  • Whiplash → La La Land, Birdman, All That Jazz
  • The Thing → In the Mouth of Madness, It Follows, The Descent
  • In Bruges → Seven Psychopaths, Dead Man's Shoes
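The embedding-scoring step described above can be sketched like this (a minimal illustration, not the repo's actual code — the function names, vector shapes, and top-N cutoff are all assumptions): each candidate's plot is embedded, compared to the seed film's plot embedding with cosine similarity, and the best matches are kept.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_candidates(seed_vec: list[float],
                    candidates: dict[str, list[float]],
                    top_n: int = 3) -> list[str]:
    """Rank candidate titles by plot-embedding similarity to the seed film.

    candidates maps title -> embedding; returns the top_n closest titles.
    """
    scored = sorted(candidates.items(),
                    key=lambda kv: cosine(seed_vec, kv[1]),
                    reverse=True)
    return [title for title, _ in scored[:top_n]]
```

In the real script the vectors would come from an embedding model rather than being hand-written; the ranking logic is the same either way.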

Features:

  • 100% local, no external AI API
  • --auto mode for daily cron/Task Scheduler
  • --genre "Horror" for themed movie nights
  • Persistent blacklist, configurable quality profile
  • Works on Windows, Linux, Mac
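For the `--auto` mode, a daily cron entry might look like this (the `--auto` and `--genre` flags are from the post; the script filename, install path, and schedule are assumptions):

```shell
# crontab -e — run the recommender unattended every day at 03:00
0 3 * * * /usr/bin/python3 /opt/radarr-movie-recommender/recommender.py --auto

# Ad-hoc themed run for a movie night:
#   python3 recommender.py --genre "Horror"
```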

GitHub: https://github.com/nikodindon/radarr-movie-recommender

[–] pfr@piefed.social 4 points 20 hours ago (1 children)

Anti-AI evangelism is at its peak rn.

[–] Andres4NY@social.ridetrans.it 2 points 14 hours ago* (last edited 12 hours ago) (1 children)

@pfr @nikodindon That assumes it won't get worse, which I hope it does. AI companies have forced me to take down web stuff that I had running for almost 2 decades, because their scrapers are so aggressive.

[–] meldrik@lemmy.wtf 1 points 14 hours ago (1 children)

Like what, and what have you tried to block it?

[–] Andres4NY@social.ridetrans.it 1 points 14 hours ago (1 children)

@meldrik They're impossible to block based on IP ranges alone. It's why all the FOSS git forges and bug trackers have started using stuff like anubis. But yes, I initially tried to block them (this was before anubis existed).

It was a few things that I had to take down; a gitweb instance with some of my own repos, for example. And a personal photo gallery. The scrapers would do pathological things like running repeated search queries for random email addresses or strings.

[–] meldrik@lemmy.wtf 1 points 14 hours ago

I’m hosting several things, including Lemmy and PeerTube. I haven’t really been aware of any scrapers, but do you know of any software that can help block it?