this post was submitted on 28 Nov 2025
581 points (94.6% liked)

Selfhosted


I got into the self-hosting scene this year when I wanted to start up my own website on an old recycled ThinkPad. A lot of time was spent learning about ufw, reverse proxies, header security hardening, and fail2ban.

Despite all that, I still had a problem with bots knocking on my ports and spamming my logs. I tried some hackery getting fail2ban to read Caddy logs, but that didn't work for me. I nearly gave up and went with Cloudflare like half the internet does. But my stubbornness about open-source self-hosting, plus this year's Cloudflare outages, encouraged me to try alternatives.

Coinciding with that, I've been seeing this thing more and more in the places I frequent, like Codeberg. This is Anubis, a proxy-style firewall that forces the browser client to do a proof-of-work security check, along with some other clever things to stop bots from knocking. I got interested and started thinking about beefing up security.

I'm here to tell you to try it if you have a public-facing site and want to break away from Cloudflare. It was VERY easy to install and configure with a Caddyfile on a Debian distro with systemctl. Within an hour it had filtered multiple bots, and so far it seems the knocks have slowed down.

https://anubis.techaro.lol/

My bot-spam woes have been seriously mitigated, if not completely eradicated. I'm very happy with tonight's little security upgrade project, which took no more than an hour of installing and reading through documentation. Current chain: Caddy reverse proxy -> Anubis -> services.
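For anyone curious, that chain can be sketched in a Caddyfile like this (the hostname and ports are placeholders, not my real config; Anubis is then told where the real backend lives via its own settings, which the admin docs cover):

```caddyfile
# Caddy terminates TLS and hands every request to Anubis.
# Anubis challenges suspicious clients and proxies the rest
# on to the backend service it was configured to protect.
example.com {
	reverse_proxy localhost:8923
}
```

The backend service itself never needs to be exposed; only Caddy listens on the public interface.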

A good place to start for installation is here:

https://anubis.techaro.lol/docs/admin/native-install/

[–] Deathray5@lemmynsfw.com 2 points 52 minutes ago

Unrelated, but one day I won't get gender envy from a random cartoon woman

[–] drkt_@lemmy.dbzer0.com 11 points 12 hours ago

Stop playing whack-a-mole with these fucking people and build TARPITS!

Make it HURT to crawl your site illegitimately.

[–] sudoer777@lemmy.ml 12 points 13 hours ago* (last edited 13 hours ago) (2 children)

I host my main server on my own hardware, and a VPN on Hetzner because my shitty ISP doesn't let me port forward. For the past year, bots were hitting my Forgejo instance hard. I forgot to disable registration and they generated hundreds of accounts with hundreds of repos with sketchy links, generating terabytes of traffic from my VPS, costing me money. I disabled registration and deleted the spam, but bots still kept hitting my server for several months, which would cause memory leaks over time, crash it, and consume CPU, and it still cost me money with terabytes of traffic per month. A few weeks ago, I put Anubis on the VPS. Now zero bots hit my Forgejo instance and I don't pay for their traffic anymore. Problem solved.

[–] WorldsDumbestMan@lemmy.today -1 points 2 hours ago (1 children)

Nice ads people! Good job!

[–] helix@feddit.org 2 points 57 minutes ago

So you think techaro paid them?

[–] Jason2357@lemmy.ca 4 points 11 hours ago

It's always code forges and wikis that are affected by this, because the scrapers spider down into every commit or edit in your entire history, then come back the next day and check every "page" again to see if any changed. Consider just blocking pages that are commit history at your reverse proxy.
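As a sketch, that kind of path blocking could look like this in a Caddyfile (the hostname is a placeholder and the paths are hypothetical; check what URL scheme your forge actually uses):

```caddyfile
git.example.com {
	# Refuse the scraper-magnet history endpoints outright,
	# before they ever reach the forge.
	@history path */commits/* */commit/* */blame/*
	respond @history 403

	# Everything else goes to the forge as usual.
	reverse_proxy localhost:3000
}
```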

[–] daniskarma@lemmy.dbzer0.com 36 points 19 hours ago* (last edited 19 hours ago)

I don't think you have a use case for Anubis.

Anubis is mainly aimed at bad AI scrapers, plus some DDoS mitigation if you have a heavy service.

You are getting hit exactly the same; Anubis doesn't put up a block list or anything, it just puts itself in front of the service. The load on your server and the risk you take are very similar with or without Anubis. Most bots are not AI scrapers, they are just probing, so the hit on your server is the same.

What you want is to properly set up fail2ban or, even better, CrowdSec. That would actually block and ban bots that try to probe your server.

If you are just self-hosting, the only thing Anubis does is divert the log noise into Anubis's logs and make your own devices do a PoW every once in a while when you want to use your services.

Being honest, I don't know what you are self-hosting, but unless it's something that's going to get DDoSed or AI-scraped, there's not much point to Anubis.

Also, Anubis is not a substitute for fail2ban or CrowdSec. You need something to detect and ban brute-force attacks. Otherwise an attacker would only need to complete the Anubis challenge, get the token for the week, and then they are free to attack your services as they like.
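The challenge/token mechanic being discussed here is plain proof-of-work: the client brute-forces a nonce until a hash meets a difficulty target, and the server verifies the result with a single hash. A minimal sketch of that idea (not Anubis's actual code; function names are made up for illustration):

```python
import hashlib

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce so that
    SHA-256(challenge + nonce) starts with `difficulty` hex zeroes.
    Expected work grows 16x with each unit of difficulty."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: one cheap hash checks what took the client many."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the point: a browser burns a little CPU once, but a scraper hitting thousands of hosts pays it everywhere. It also makes the point above concrete: once a client holds a valid token, no further work is demanded of it.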

[–] smh@slrpnk.net 17 points 18 hours ago

The creator is active on a professional slack I'm on and they're lovely and receptive to user feedback. Their tool is very popular in the online archives/cultural heritage scene (we combine small budgets and juicy, juicy data).

My site has enabled js-free screening when the site load is low, under the theory that if the site load is too high then no one's getting in anyway.

[–] quick_snail@feddit.nl 24 points 23 hours ago (2 children)

Kinda sucks how it makes websites inaccessible to folks who have to disable JavaScript for security.

[–] poVoq@slrpnk.net 24 points 22 hours ago (5 children)

It kinda sucks how AI scrapers make websites inaccessible to everyone 🙄

[–] Mwa@thelemmy.club 4 points 16 hours ago

and they don't respect robots.txt

[–] WhyJiffie@sh.itjust.works 13 points 19 hours ago (3 children)

there's a fork that has non-JS checks. I don't remember the name, but maybe that's what should be made more known

[–] url@feddit.fr 21 points 23 hours ago (1 children)

Honestly, I'm not a big fan of Anubis. It fucks over users with slow devices.

https://lock.cmpxchg8b.com/anubis.html

[–] url@feddit.fr 14 points 23 hours ago

Did I forget to mention it doesn't work without JS, which I keep disabled?

[–] TerHu@lemmy.dbzer0.com 15 points 23 hours ago (1 children)

Yes, please be mindful when using Cloudflare. With them you're possibly inviting in a much, much bigger problem.

https://www.devever.net/~hl/cloudflare

[–] quick_snail@feddit.nl 7 points 22 hours ago* (last edited 22 hours ago)

Great article, but I disagree about WAFs.

Try to secure a nonprofit's web infrastructure as 1 IT guy with no budget for devs or security.

It would be nice if we could update servers constantly and patch unmaintained code, but sometimes you just need to front it with something that plugs those holes until you have the capacity to do updates.

But 100% the WAF should be run locally, not a MiTM from evil US corp in bed with DHS.

[–] non_burglar@lemmy.world 184 points 1 day ago (9 children)

Anubis is an elegant solution to the ai bot scraper issue, I just wish the solution to everything wasn't just spending compute everywhere. In a world where we need to rethink our energy consumption and generation, even on clients, this is a stupid use of computing power.

[–] Dojan@pawb.social 108 points 1 day ago* (last edited 1 day ago) (15 children)

It also doesn’t function without JavaScript. If you’re security or privacy conscious chances are not zero that you have JS disabled, in which case this presents a roadblock.

On the flip side of things, if you are a creator and you’d prefer to not make use of JS (there’s dozens of us) then forcing people to go through a JS “security check” feels kind of shit. The alternative is to just take the hammering, and that feels just as bad.

No hate on Anubis. Quite the opposite, really. It just sucks that we need it.

[–] Appoxo@lemmy.dbzer0.com 3 points 18 hours ago (2 children)

Maybe you know the answer to my question:
If I wanted to use an app that doesn't run in a web browser (e.g. the native Jellyfin app), how would that work? Does it still work then?

[–] chaospatterns@lemmy.world 1 points 13 hours ago (1 children)

If the app is just a WebView wrapper around the web application, then the challenge page would load and the check would try to run.

If it's a native Android/iOS app, then it probably wouldn't work because the app would try to make HTTP API calls and get back something unexpected.

[–] Appoxo@lemmy.dbzer0.com 1 points 4 hours ago

Authelia already broke the functionality for jellyfin and symfonium.
So I guess the answer is no.

[–] SmokeyDope@piefed.social 1 points 15 hours ago (1 children)

It explicitly checks for web browser properties when applying challenges, and all its challenges require basic web functionality like a page refresh. Unless the connection to your server involves a browser-style user agent string, it won't work. That's how I understand it, anyway. Hope this helped.

[–] Appoxo@lemmy.dbzer0.com 1 points 14 hours ago

Assuming what you said is correct, it wouldn't help my use case.
I'm not hosting any page meant for public consumption anyway, so it's not really important.
But thanks for answering :)

[–] 0_o7@lemmy.dbzer0.com 28 points 1 day ago (7 children)

I don't mind Anubis but the challenge page shouldn't really load an image. It's wasting extra bandwidth for nothing.

Just parse the challenge and move on.
