Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.
- Don't duplicate the full text of your blog or GitHub here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues with the community? Report them using the report flag.
Questions? DM the mods!
Personally, going 10G on my networking gear has significantly improved my experience with self-hosting, especially when it comes to file transfers. 1G can just be extremely slow when you're moving large amounts of data, so I don't really understand why people recommend against 10G here of all places.
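For a rough sense of scale, here's a back-of-the-envelope sketch (my own illustrative numbers, assuming the disks on both ends keep up and ignoring protocol overhead):

```python
# Back-of-envelope transfer-time comparison between a 1 Gbit/s and a
# 10 Gbit/s link. Assumes ideal conditions: no protocol overhead and
# storage fast enough to saturate the link.

def transfer_time_hours(size_gb: float, link_gbit_per_s: float) -> float:
    """Hours needed to move `size_gb` gigabytes over a `link_gbit_per_s` Gbit/s link."""
    size_gbit = size_gb * 8                # GB -> Gbit
    return size_gbit / link_gbit_per_s / 3600

for size_gb in (100, 1000, 4000):          # e.g. a game, a photo library, a media backup
    t1 = transfer_time_hours(size_gb, 1)
    t10 = transfer_time_hours(size_gb, 10)
    print(f"{size_gb:>5} GB: {t1:5.2f} h at 1 Gbit/s  vs  {t10:5.2f} h at 10 Gbit/s")
```

Real-world throughput will be lower once SMB/NFS overhead and disk speeds come into play, but the order-of-magnitude gap is the point.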
I think it has to do with the difference in data volumes between self-hosters and data hoarders.
Example: a self-hoster with an RPi Home Assistant setup and an N100 server with some paperwork, photos, Nextcloud, and a small Jellyfin library.
They have a few terabytes of storage, and their goal is to replace services they paid for in an efficient manner. Large data transfers happen extremely rarely and are limited in size, likely backing up some important documents or family photos. Maybe they have a few hundred Mbit of internet at most.
Vs
A data hoarder with 500 TB of RAID array storage who indexes all media possible, has every retail game sold for multiple consoles, has taken 10k RAW photos, runs multiple daily and weekly backups to different VPS storage providers, hosts a public website, has >gigabit internet, and is seeding 500 torrents at any given time.
I would venture to guess that option 1 covers the vast majority of self-hosting cases, and for them 10Gb networking is much more expensive for limited benefit.
In a data-hoarding community, on the other hand, option 2 would be a reasonable assumption, and that setup could benefit greatly from 10Gb.
Also, 10Gb is great for companies, which are less likely to be posting in a self-hosting community.
I somewhat disagree that you have to be a data hoarder for 10G to be worth it. For example, I've got a headless Steam client on my server that holds my larger games (~2 TB all in all, so not data hoarder territory), which lets me install and update those games at ~8 Gbit/s. That in turn lets me run a leaner desktop PC, since I can just uninstall the larger games once I'm not playing them daily anymore, and it saves me time when Steam inevitably fails to auto-update a game on my desktop before I want to play it (rough numbers below).
Arguably a niche use case, but it exists alongside other such niche use cases. So if someone comes into this community and asks how best to implement 10G networking, I'll assume they (at least think they) have such a use case on their hands and want to improve that situation a bit.
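To put the time saved in perspective, here's a quick sketch. The ~8 Gbit/s figure is from my setup; the 150 GB game size and 300 Mbit/s internet speed are assumptions picked purely for illustration:

```python
# Rough comparison: pulling a large game from a local server over 10G
# vs over plain gigabit vs redownloading it from the internet.
# Only the 8 Gbit/s figure is measured; the rest are assumed for illustration.

def minutes(size_gb: float, speed_gbit_s: float) -> float:
    """Minutes to transfer `size_gb` GB at `speed_gbit_s` Gbit/s."""
    return size_gb * 8 / speed_gbit_s / 60

game_gb = 150  # hypothetical large modern game

print(f"Local cache @ 8 Gbit/s  : {minutes(game_gb, 8):6.1f} min")
print(f"Local LAN   @ 1 Gbit/s  : {minutes(game_gb, 1):6.1f} min")
print(f"Internet    @ 0.3 Gbit/s: {minutes(game_gb, 0.3):6.1f} min")
```

A couple of minutes versus the better part of an hour is the difference between "I'll just reinstall it now" and planning ahead.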
And X11 forwarding. There are a few server tasks I just find easier with a GUI, and they feel kind of laggy over 1G. Not to mention an old Windows program running in WINE over forwarded X. There are all kinds of things you can do internally to eat up bandwidth.