submitted 11 months ago* (last edited 11 months ago) by Trincapinones@lemmy.world to c/selfhosted@lemmy.world

Hi there! I have an old PC that I use as a server (running Ubuntu Server), and I would like to add an NVIDIA 1050 to it (for Jellyfin and Guacamole).

In the past I tried to do this and somehow corrupted the system when installing the video drivers; I have always had complications installing NVIDIA drivers on Linux.

Could any of you help me figure out the right way to install the video drivers on a headless system and use the GPU correctly with Docker Compose?

Thanks in advance 😄

top 16 comments
[-] m12421k@iusearchlinux.fyi 9 points 11 months ago

Not exactly what you want, but check out the Games on Whales Wolf docs. It's configured for GPU-accelerated X11 apps on Docker. If I recall correctly, one of the documents explains setting up the NVIDIA drivers on Docker completely.

[-] abeltramo@lemmy.world 14 points 11 months ago

Wow, it's the first time I've seen someone suggesting my project out in the wild, thanks! Here's the Wolf quickstart guide

[-] vynlwombat@lemmy.world 6 points 11 months ago

What went wrong last time you tried installing the drivers?

Ubuntu has Nvidia drivers in the official repos: https://www.cyberciti.biz/faq/ubuntu-linux-install-nvidia-driver-latest-proprietary-driver/
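For a headless box you can skip the desktop packages entirely. A rough sketch of the repo route, assuming Ubuntu 22.04 and the 535 driver series (run ubuntu-drivers first to see what it actually recommends for your card):

    # See which driver Ubuntu recommends for the 1050
    ubuntu-drivers devices

    # Headless install: kernel module and nvidia-smi, no X/desktop packages
    sudo apt update
    sudo apt install nvidia-headless-535-server nvidia-utils-535-server

    # Reboot so the kernel module loads, then check the card is visible
    sudo reboot
    nvidia-smi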

NVIDIA also provides the Container Toolkit for exposing GPUs to Docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/1.14.2/install-guide.html
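The short version of that install guide, as I remember it (repo URL and package names come from NVIDIA's doc, so double-check against the current page):

    # Add NVIDIA's container toolkit repository
    curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey \
      | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
    curl -sL https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list \
      | sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' \
      | sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
    sudo apt update && sudo apt install -y nvidia-container-toolkit

    # Register it as a Docker runtime and restart Docker
    sudo nvidia-ctk runtime configure --runtime=docker
    sudo systemctl restart docker

    # Quick test: should print the same table nvidia-smi shows on the host
    docker run --rm --gpus all ubuntu nvidia-smi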

[-] Trincapinones@lemmy.world 2 points 11 months ago* (last edited 11 months ago)

Thanks for the help! I'm trying to install the driver from the official repo, but I don't know which version I should use for a headless Docker setup with a 1050.

I think that's what went wrong last time; I tried to install the driver from the NVIDIA documentation thinking that it would "just work", just like on Windows 😓

Edit: I've solved it, and now it works perfectly. Thanks again!

[-] fraydabson@sopuli.xyz 6 points 11 months ago* (last edited 11 months ago)

Once you do get the drivers installed properly for your OS (speaking of which, which distro are you using? Edit: nvm, I see you are on Ubuntu), here are the steps to give Docker access to it: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

[-] Trincapinones@lemmy.world 1 points 11 months ago

Thank you! Last time I used the official desktop drivers because most guides recommended them. Once I install the NVIDIA Container Toolkit, I can include the GPU in the container's runtime and it should work, right?

[-] fraydabson@sopuli.xyz 3 points 11 months ago* (last edited 11 months ago)

I used the desktop drivers as well (on Arch, from the extra repo) for my headless Arch server.

Regarding the NVIDIA Container Toolkit: once it was installed, I added this to my Jellyfin Docker Compose file:

deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia
          capabilities: [gpu]

Then, to confirm, I ran docker exec -it jellyfin nvidia-smi, which responded with my GPU. Note that (for me) the "processes" part of nvidia-smi comes up blank, even when Jellyfin is using it. I can tell it is working, though, from the Jellyfin logs, and when it is not using the GPU, instead of being blank it says "no processes".

Edit for formatting, and to add that I believe I also had to add environment variables to Jellyfin (I am using lsio's version):

      - NVIDIA_DRIVER_CAPABILITIES=all
      - NVIDIA_VISIBLE_DEVICES=all
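Putting the two pieces together, here's a minimal sketch of what the whole service can look like (assuming the linuxserver.io Jellyfin image; the volume paths and port are just placeholders):

    services:
      jellyfin:
        image: lscr.io/linuxserver/jellyfin:latest
        environment:
          - PUID=1000
          - PGID=1000
          - NVIDIA_DRIVER_CAPABILITIES=all
          - NVIDIA_VISIBLE_DEVICES=all
        volumes:
          - ./config:/config
          - ./media:/data/media
        ports:
          - 8096:8096
        deploy:
          resources:
            reservations:
              devices:
                - driver: nvidia
                  capabilities: [gpu]
        restart: unless-stopped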
[-] dandroid@dandroid.app 0 points 11 months ago

I recently did this and found those instructions to be beyond useless. The repository URIs were all old and dead. Not sure if they have updated that doc since then, but they combined all the deb-based distros into one repo and all the rpm-based distros into another.

[-] fraydabson@sopuli.xyz 1 points 11 months ago

Yeah, thankfully I use Arch Linux. Their wiki guide was much better.

[-] netwren@lemmy.world 2 points 11 months ago

Not Docker, but you could run k3s and use the NVIDIA GPU Operator to manage installing the video drivers for you on your single-node cluster.
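For reference, the GPU Operator is normally installed with Helm. A minimal sketch on a single-node k3s cluster (chart and repo names are from NVIDIA's docs; the operator can also deploy the driver itself, see its options for details):

    # Add NVIDIA's Helm repo and install the GPU Operator
    helm repo add nvidia https://helm.ngc.nvidia.com/nvidia
    helm repo update
    helm install gpu-operator nvidia/gpu-operator \
      --namespace gpu-operator --create-namespace

    # Verify: the operator pods come up and the node advertises nvidia.com/gpu
    kubectl get pods -n gpu-operator
    kubectl describe node | grep nvidia.com/gpu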

[-] notfromhere@lemmy.one 2 points 11 months ago

Do you have a link to a tutorial on this? I've been thinking about adding my amd64 server with an NVIDIA GPU to my Raspberry Pi K3s cluster.

[-] netwren@lemmy.world 3 points 11 months ago

No! Maybe I should work on this because it was fairly simple for me to do after some research.

[-] notfromhere@lemmy.one 1 points 11 months ago

That would be pretty dope. If you end up writing it up, don't forget about me 😁

[-] Trincapinones@lemmy.world 2 points 11 months ago

I didn't know about it, thanks! I'll give it a try 😄

[-] netwren@lemmy.world 1 points 11 months ago

It actually was pretty straightforward. Saying this from experience, as I used a TensorRT container image with a 1060 for image classification.

[-] possiblylinux127@lemmy.zip -1 points 11 months ago

I assume you already have the NVIDIA card? If you are buying one, go for AMD.
