this post was submitted on 31 Mar 2026

Linux

I run a small home lab - number of servers varies from time to time. Currently five, all Linux.

When I heard about log consolidation I imagined that I would get a nice dashboard type view where I could see a consolidated, real time, view of all my server logs go by. Victoria Logs does that for me. I also imagined that there would be a way to flag particular log entries as "normal, and expected" so they would be excluded in the future - the goal being to get this dashboard to a state where if anything appears, it's probably bad. I can't see a way to do that in Victoria Logs. Do I need to try harder? If Victoria Logs won't do it - is there anything that will?

top 9 comments
[–] Shadow@lemmy.ca 1 points 1 week ago

VL is really about aggregation, not display. You'd probably just need to set up a Grafana dashboard with filters for all your normal traffic
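The excluding can happen in the panel's query itself. A rough LogsQL sketch of a "show me everything I haven't classified as normal" panel — the time range and every pattern here are invented examples, not anything from a real setup:

```logsql
_time:1h AND NOT (
    "CRON"
    OR "Accepted publickey"
    OR "pam_unix(sudo:session)"
)
```

As new "normal" entries show up, you append them to the NOT list, and the dashboard gradually converges on only showing surprises.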

[–] Brummbaer@pawb.social 0 points 1 week ago (1 children)

What are you using to ship the logs to VL?

If you want to exclude "normal" logs you should start excluding them before they reach VL, so the only logs you have are the interesting ones.
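For example, with Fluent Bit as the shipper, a grep filter drops matching records before they ever leave the box (the CRON pattern is just an illustration, and `log` is the default key the tail input uses):

```
[FILTER]
    Name    grep
    Match   *
    Exclude log CRON
```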

[–] GreatBlueHeron@lemmy.ca 1 points 1 week ago

> What are you using to ship the logs to VL?

That's the reason I'm here asking about logging. I'm in the process of changing and wondering if I should switch it all up. I was using systemd-journal-remote, but I'm switching from Debian to Alpine so - no more systemd.
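For what it's worth, VictoriaLogs accepts logs over plain HTTP (it documents a JSON-lines ingest endpoint, among others), so any lightweight shipper that runs without systemd can stand in for systemd-journal-remote. A hedged Fluent Bit sketch — the hostname, port, and file path are assumptions; check the VL data-ingestion docs for the exact endpoint parameters:

```
[INPUT]
    Name    tail
    Path    /var/log/messages
    Tag     syslog

[OUTPUT]
    Name    http
    Match   *
    Host    victorialogs.example.lan
    Port    9428
    URI     /insert/jsonline
    Format  json_lines
```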

> you should start excluding them before they reach VL

Now that confuses me. As I said in my original post, I had some preconceptions about centralised logging before I set it up, and having a single place to manage filters was certainly something I was hoping to get from it. Also, any filtering would only be for reporting; I'd like to keep a full set of log data for potential problem analysis etc.

[–] hades@feddit.uk 0 points 1 week ago (1 children)

In case you decide to look for alternatives, I would probably go with elastic/filebeat/grafana, a fairly standard log monitoring suite. Not saying it's better or worse than Victoria Logs, which I have no experience with.

[–] GreatBlueHeron@lemmy.ca 1 points 1 week ago (1 children)

I'm already running a grafana instance, so I'll look into elastic/filebeat. Thanks.

[–] Shadow@lemmy.ca 2 points 1 week ago (1 children)

Elastic is heavy. You might want to check out Loki; I haven't used it, but I think it'd be easier to get started with than Victoria Logs since it integrates tightly with Grafana
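Part of that tight integration is that Loki is a built-in Grafana datasource type, so it can be wired up declaratively with Grafana's provisioning mechanism (the URL and file path below are assumptions for illustration):

```yaml
# e.g. /etc/grafana/provisioning/datasources/loki.yaml (hypothetical path)
apiVersion: 1
datasources:
  - name: Loki
    type: loki
    access: proxy
    url: http://loki.example.lan:3100
```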

[–] GreatBlueHeron@lemmy.ca 1 points 1 week ago (1 children)

Yeah, I've been doing some more reading. Victoria Logs is doing a good job consolidating my logs and is very lightweight. It's the visualisation that I'm missing. Grafana can do it, but I'm having trouble getting my head around it. That's OK - it's just my home lab and it's mainly a learning exercise - I need to learn some more.

[–] Shadow@lemmy.ca 2 points 1 week ago (1 children)

Yeah, I use VL for lemmy.ca and it's super quick and lightweight, but getting what you want into Grafana can be difficult.

[–] Brummbaer@pawb.social 1 points 1 week ago

The more you can filter and label at the source, the less you have to work out in VL.

I use alloy (which is kinda heavy) to extract and prepare only the data I want and it works great so far.
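In case it helps as a starting point, here's a trimmed-down sketch of what that looks like in Alloy's config language. The component names are real Alloy ones, but the paths, patterns, and endpoint are invented; note that VictoriaLogs can accept Alloy's output via its Loki-compatible push endpoint:

```river
// tail local log files (paths are illustrative)
local.file_match "system" {
  path_targets = [{"__path__" = "/var/log/*.log"}]
}

loki.source.file "system" {
  targets    = local.file_match.system.targets
  forward_to = [loki.process.drop_noise.receiver]
}

// drop "normal" lines at the source before they're shipped
loki.process "drop_noise" {
  stage.drop {
    expression = "CRON|Accepted publickey"
  }
  forward_to = [loki.write.vl.receiver]
}

// VictoriaLogs exposes a Loki-compatible push API
loki.write "vl" {
  endpoint {
    url = "http://victorialogs.example.lan:9428/insert/loki/api/v1/push"
  }
}
```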