this post was submitted on 12 Mar 2026
61 points (90.7% liked)
Programming
Good example of why I don't rely on technology I don't control. I want my workflow to be future-proof and have a predictable cost.
In general I agree with you, but LLMs are the one exception where running them locally is neither practical nor cost-effective. If you want to use them, by far the better option is to pay someone for the service.
The second-best option is an inference provider for open-weight models; that way, if they raise the price or stop offering the model, you can get it from someone else or eventually upgrade to self-hosting.
I agree. I use openrouter myself.
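The portability argument above can be sketched in code. Most inference providers, OpenRouter included, expose an OpenAI-compatible chat-completions endpoint, so switching providers (or moving to a self-hosted server) only changes the base URL and model name. This is a minimal sketch, not a definitive client; the model name, environment variable, and localhost URL are illustrative assumptions.

```python
# Sketch of the "portable" setup described above, assuming an
# OpenAI-compatible chat-completions endpoint (OpenRouter exposes one).
import json
import os

def build_chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build a provider-agnostic chat-completions request.

    Only base_url and model change when you switch providers or
    self-host; the payload shape stays the same.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {
            # Env var name is an assumption for this sketch.
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Same code, interchangeable backends (model names are illustrative):
hosted = build_chat_request("https://openrouter.ai/api/v1",
                            "meta-llama/llama-3.1-8b-instruct", "hello")
local = build_chat_request("http://localhost:8080/v1",
                           "llama-3.1-8b-instruct", "hello")
```

The point is exactly the one made above: because the request shape is shared, nothing in your workflow is tied to one vendor.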