this post was submitted on 04 Dec 2025

Hardware

I'm deploying Frigate (camera NVR software) at work for our 100-something security cams. We only buy Dell servers, which is a shame because I would probably have gone Supermicro in this instance. Anyway, I'm doing some research and it's really unclear to me which Dell server I should go with if I intend to install a GPU in it.

I'm thinking I'll probably go refurbished, maybe one or two generations old.

Should I go with a 4U server, and if I did, would that eliminate the need for a PCIe riser card?

Do I need a datacenter-class GPU? I've read that many of the powerful consumer cards will simply be too large for the server chassis.

Right now I'm testing with an R550, and there's only one 8-pin connector available on the power supply. How do I power a 12-pin connector on the GPU if that's all that's available?

[–] bulwark@lemmy.world 5 points 11 hours ago* (last edited 9 hours ago)

I've used Frigate for a few years with up to 5 cameras, but 100 might be pushing it for a single card. I'm fond of the Google Coral M.2 chips for inference, which the software maintainer recommends. You'd need roughly 5-10 of them, I'd guess, plus one low-tier GPU if you're not transcoding too much. I talked to the guy who made the project a few years ago when it was still small, and he helped me with FFmpeg parameters to get CUDA H.265 decoding, which is also important depending on your cameras. Maybe talk to him directly through GitHub.
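For a sense of what that looks like in practice, here's a rough sketch of the relevant bits of a Frigate config, with NVDEC decoding via Frigate's built-in NVIDIA preset and two PCIe Corals as detectors. Field names follow Frigate's documented schema; the detector names, camera name, and RTSP URL are placeholders, and the exact preset name can vary between Frigate versions, so check the docs for your release:

```yaml
# GPU-accelerated H.265 decode (Frigate's NVIDIA hwaccel preset)
ffmpeg:
  hwaccel_args: preset-nvidia-h265

# One entry per Coral; PCIe Corals are addressed as pci:0, pci:1, ...
detectors:
  coral0:
    type: edgetpu
    device: pci:0
  coral1:
    type: edgetpu
    device: pci:1

cameras:
  example_cam:              # placeholder name
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@10.0.0.10:554/stream  # placeholder URL
          roles:
            - detect
            - record
```

Frigate spreads detection load across all configured detectors automatically, which is how a stack of Corals can cover a large camera count without leaning on the GPU for inference.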