this post was submitted on 20 Mar 2026
-13 points (43.4% liked)

Selfhosted


I built a Python script that uses a local Ollama LLM to automatically find and add movies to Radarr.

It picks random films from your library, asks Ollama for similar suggestions based on theme and atmosphere, validates against OMDb, scores with plot embeddings, then adds the top results to Radarr automatically.
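This isn't the author's code, but the described flow might be sketched roughly like this (the function names, the cosine-similarity scoring, and the stubbed-out API calls are my assumptions; the real script's internals may differ):

```python
import math
import random

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_by_plot_similarity(seed_vec, candidates):
    """Score each candidate's plot embedding against the seed film's
    embedding and return titles sorted best-first."""
    scored = [(cosine(seed_vec, vec), title) for title, vec in candidates]
    return [title for _, title in sorted(scored, reverse=True)]

def recommend(library, ask_ollama, validate_omdb, embed, radarr_add, top_n=3):
    """Hypothetical end-to-end flow matching the description above.
    The four callables stand in for the Ollama, OMDb, embedding, and
    Radarr integrations."""
    seed = random.choice(library)                  # pick a random film
    suggestions = ask_ollama(seed)                 # LLM suggests similar titles
    valid = [t for t in suggestions if validate_omdb(t)]  # drop hallucinated titles
    ranked = rank_by_plot_similarity(
        embed(seed), [(t, embed(t)) for t in valid]
    )
    for title in ranked[:top_n]:
        radarr_add(title)                          # push top results to Radarr
    return ranked[:top_n]
```

The OMDb validation step matters here: LLMs will occasionally invent plausible-sounding titles, so checking each suggestion against a real database before scoring filters those out.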

Examples:

  • Whiplash → La La Land, Birdman, All That Jazz
  • The Thing → In the Mouth of Madness, It Follows, The Descent
  • In Bruges → Seven Psychopaths, Dead Man's Shoes

Features:

  • 100% local, no external AI API
  • --auto mode for daily cron/Task Scheduler
  • --genre "Horror" for themed movie nights
  • Persistent blacklist, configurable quality profile
  • Works on Windows, Linux, Mac

GitHub: https://github.com/nikodindon/radarr-movie-recommender

[–] four@lemmy.zip 3 points 1 day ago (1 children)

I'm not an expert, but LLMs should still be deterministic. If you run the model with 0 creativity (or whatever the randomness setting is called) and provide exactly the same input, it should provide the same output. That's not how it's usually configured, but it should be possible. Now, if you change the input at all (change the order of movies, misspell a title, etc.), then the output can change in an unpredictable way.

[–] hendrik@palaver.p3x.de 2 points 1 day ago* (last edited 15 hours ago)

Yes. I think determinism is a misunderstood concept. In computing, it means the exact same input always leads to the same output. That could be a correct result or an entirely wrong one, though. As long as it stays the same, it's deterministic. There's some benefit in introducing randomness to AI, but it can be run in an entirely deterministic way as well. It just depends on the settings. (The setting is called "temperature".)
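A toy illustration of why temperature 0 makes sampling deterministic. This is a simplified sketch of the token-sampling step, not Ollama's actual code: at temperature 0 the sampler degenerates to a plain argmax, so no random draw happens at all.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Pick a token index from raw logits.

    temperature == 0 means greedy (argmax) decoding, which is fully
    deterministic for a given input. temperature > 0 flattens or
    sharpens the softmax distribution before a weighted random draw.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)                               # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs)[0]
```

In Ollama itself this corresponds to setting `temperature` (and, for reproducible non-zero temperatures, a fixed `seed`) in the request options.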