Seems like this was acknowledged, but it's a good point nonetheless (and one that's often overlooked).
I'm currently sitting on 4TB of data (largely movies and TV shows), running on 4-year-old hardware, with 3 local replicas, backed up to the cloud.
My power and cloud costs are trivial - about 25 cents a day, or less than $100/year (not counting hardware, which works out to roughly $150/year to keep similar performance levels). My 4-year-old "server" idles at about 20 watts; I could probably bring that down to around 10 W with a newer NUC or similar.
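For anyone wondering how that ~25 cents/day could break down, here's a rough back-of-envelope sketch. The electricity rate and the archival cloud price are my assumptions for illustration, not figures from the setup above.

```python
# Rough back-of-envelope check of the ~25 cents/day figure.
# ELECTRICITY_RATE and CLOUD_RATE_PER_TB are assumptions, not
# numbers from the setup described above.

IDLE_WATTS = 20           # stated idle draw of the "server"
DATA_TB = 4               # stated data size
ELECTRICITY_RATE = 0.15   # assumed $/kWh
CLOUD_RATE_PER_TB = 1.00  # assumed $/TB-month (archival-tier pricing)

power_per_day = IDLE_WATTS * 24 / 1000 * ELECTRICITY_RATE   # ~$0.07/day
cloud_per_day = DATA_TB * CLOUD_RATE_PER_TB * 12 / 365      # ~$0.13/day

total = power_per_day + cloud_per_day
print(f"power: ${power_per_day:.2f}/day")
print(f"cloud: ${cloud_per_day:.2f}/day")
print(f"total: ${total:.2f}/day  (~${total * 365:.0f}/year)")
```

Under those assumptions the total lands around $0.20/day, i.e. in the same ballpark as the quoted figure.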
I could easily store everything my extended family produces (including cousins, about 50 people) with a similar setup. In fact, I'm working on just such a project - an SFF- or NUC-type device with sufficient storage.
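As a quick sanity check on the family-scale idea, here's a rough sizing sketch; the per-person storage figure is purely an assumption for illustration, not something stated above.

```python
# Rough sizing sketch for the family-scale version of this setup.
# GB_PER_PERSON is an assumption for illustration, not a figure
# from the comment above.

PEOPLE = 50          # stated rough headcount, cousins included
GB_PER_PERSON = 100  # assumed photos/docs/phone backups per person
REPLICAS = 3         # mirroring the 3 local replicas mentioned above

raw_tb = PEOPLE * GB_PER_PERSON / 1000
total_tb = raw_tb * REPLICAS

print(f"raw data: {raw_tb:.1f} TB")
print(f"with {REPLICAS} local copies: {total_tb:.1f} TB")
```

Even at triple redundancy that comes to roughly 15 TB, which is comfortably within reach of a single SFF/NUC-class box with a couple of large drives.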
Edit: autocorrect changed $100 to $10