Chrome: Sees new website domain
Google: 👀
When a CA issues an SSL/TLS certificate, they're required to submit it to public CT logs (append-only, cryptographically verifiable ledgers). This was designed to detect misissued or malicious certificates.
Red and blue teams alike use this resource (crt.sh) to enumerate subdomains.
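For example, crt.sh has a JSON endpoint you can script against; a quick sketch (the domain is a placeholder, and `jq` is assumed to be installed):

```sh
# Pull every name crt.sh has logged for a domain; %25 is a URL-encoded
# "%" wildcard, and name_value holds the cert's subject/SAN names.
curl -s "https://crt.sh/?q=%25.example.com&output=json" \
  | jq -r '.[].name_value' | sort -u
```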
If you use Let's Encrypt (the ACME protocol), AFAIK you can find all issued domains in a directory that even has a search, no matter whether it's a wildcard or not.
It was something like https://crt.sh/, but I can't find the exact site anymore.
LE: you can also find some here https://search.censys.io/
Holy shit, this has every cert I’ve ever generated or renewed since 2015.
Certificate Transparency makes public all issued certificates in the form of a distributed ledger, giving website owners and auditors the ability to detect and expose inappropriately issued certificates.
This.
That's why substituting obscurity for security is not a good idea. It doesn't take much to be reasonably safe, but that little bit of good practice does need to be done :)
No. Not this.
OP is using the hidden-subdomain pattern: wildcard DNS plus a wildcard SSL cert.
That way the subdomain acts as a password and the application is essentially inaccessible to bot crawls.
It works very well.
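For reference, the wildcard cert is what keeps the individual name out of the CT logs: only `*.example.com` ever gets logged. A rough sketch with certbot (domain illustrative; wildcards require the DNS-01 challenge):

```sh
# Issue one wildcard cert so Certificate Transparency only ever sees
# "*.example.com", never the secret subdomain itself.
certbot certonly --manual --preferred-challenges dns \
  -d 'example.com' -d '*.example.com'
```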
Hmm. I feel like conflating a subdomain with a password is a particularly sketchy idea, but you do you.
I can't say I know the answer, but a few ideas:
You could try it again: create the domain in the config and then do absolutely nothing; don't try to confirm it works in any way. If you don't see the same behaviour, you can do one of the above and then the other and see when it kicks in. If it gets picked up without you doing anything... then pass!
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:
| Fewer Letters | More Letters |
|---|---|
| CA | (SSL) Certificate Authority |
| DNS | Domain Name Service/System |
| IP | Internet Protocol |
| SSL | Secure Sockets Layer, for transparent encryption |
| TLS | Transport Layer Security, supersedes SSL |
| VPN | Virtual Private Network |
| VPS | Virtual Private Server (opposed to shared hosting) |
[Thread #990 for this comm, first seen 11th Jan 2026, 01:25] [FAQ] [Full list] [Contact] [Source code]
Good bot
Kudos to the bot.
For anyone who needs to read it: at the end of the day this is obscurity, not security; however, obscurity is a good secondary defense because it buys time.
I too would be interested to learn how this leaked
It's not even obscurity; it's logged publicly.
It's not. Wildcard DNS and a wildcard cert: the domain is not logged publicly.
People who keep saying it's logged publicly simply don't understand the setup and the technology.
How is it being logged publicly? Like OP said, there is no specific subdomain registered in the DNS records (it uses a wildcard instead). Same for the SSL cert. The only things I can think of are the browser leaking the subdomain (through Google or Microsoft), or the DNS queries themselves being logged and leaked (possibly by the ISP inspecting the traffic, or by their own DNS servers logging and leaking). I would hardly call either of those public.
Maybe that particular subdomain is getting treated as the default virtual host by Apache? Are the other subdomains receiving scans too?
I don't use Apache much, but NGINX sometimes surprises you with which vhost it serves if the default is not explicitly defined.
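If you want to check what the server actually treats as the default, one way (assuming nginx; `-T` tests and dumps the parsed config):

```sh
# Dump the full running config and look for an explicit default_server;
# without one, nginx falls back to the first server block on that port.
nginx -T 2>/dev/null | grep -nE 'default_server|server_name'
```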
If you have browser with search suggestions enabled, everything you type in URL bar gets sent to a search engine like Google to give you URL suggestions. I would not be surprised if Google uses this data to check what it knows about the domain you entered, and if it sees that it doesn't know anything, it sends the bot to scan it to get more information.
But in general, you can't access a domain without using a browser, which might send what you type to some company's backend, and voila, you've leaked your data.
Easily verified by creating another bunch of domains and using a browser that doesn't do tracking, like Waterfox.
What you can do is segregate networks.
If the browser runs in, say, a VM with only access to the intranet and no internet access at all, this risk is greatly reduced.
You need to look at the DNS server used by whatever client is resolving that name. If it's going to an external recursive resolver instead of using your own internal DNS server then you could be leaking lookups to the wider internet.
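A quick way to check (hostname illustrative):

```sh
# The ";; SERVER:" line in dig's footer shows which resolver answered.
# If it isn't your internal DNS server, the lookup is leaving your network.
dig secret.example.com A | grep 'SERVER:'
```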
Crawlers typically crawl by IP.
Are you sure they're not just using the IP?
You need to explicitly configure the server to drop the connection on an invalid domain.
I use a similar pattern and get zero crawls.
+1 for dropped connections on invalid domains. Or hell, redirect them to something stupid like ooo.eeeee.ooo just so you can check your redirect logs and see what kind of BS the bots are up to.
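For anyone wanting the config side of this, a minimal sketch assuming nginx (file path and reload command illustrative; `ssl_reject_handshake` needs nginx 1.19.4+):

```sh
# Catch-all vhost: anything not matching a real server_name gets nothing.
cat > /etc/nginx/conf.d/00-default-drop.conf <<'EOF'
# Plain HTTP: 444 is nginx's non-standard "close the connection" code
server {
    listen 80 default_server;
    server_name _;
    return 444;
}
# HTTPS: refuse the TLS handshake outright for unknown SNI
server {
    listen 443 ssl default_server;
    ssl_reject_handshake on;
}
EOF
nginx -t && nginx -s reload
```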
Is this at a webserver level?
It can be both the server and the DNS provider. For instance, Cloudflare allows you to set rules for what traffic is allowed, and you can set it to automatically drop traffic for everything except your specific subdomains. I also have mine set to ban an IP after 5 failed subdomain attempts. That alone will do a lot of heavy lifting, because it ensures your server is only getting hit with requests that have already figured out a working subdomain.
Personally, I see a lot of hacking attempts aimed at my main www. subdomain, looking for WordPress. Luckily, I don't run WordPress. But the bots are 100% out there, casually scanning for WordPress vulnerabilities.
A long time ago, I turned a PC in my basement into a web server. No DNS. Just a static IP address. Within 15 minutes, the logs showed it was getting scanned.
SSL only encrypts traffic in transit. You still need to set up auth/access control. Even better, stick it behind a Web Application Firewall.
Or set up a tunnel. Cloudflare offers a free one: https://developers.cloudflare.com/cloudflare-one/networks/connectors/cloudflare-tunnel/
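The rough shape of a named tunnel, if it helps (tunnel and host names illustrative; the ingress mapping to your local service goes in `~/.cloudflared/config.yml`):

```sh
cloudflared tunnel login                                  # authorize against your CF account
cloudflared tunnel create homelab                         # creates tunnel + credentials file
cloudflared tunnel route dns homelab secret.example.com   # DNS record pointing at the tunnel
cloudflared tunnel run homelab                            # outbound-only connection, no open ports
```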
Do post again if you figure it out!
Will do!
My guess would be NSEC zone walking, if your DNS provider supports DNSSEC. But that shouldn't work with unregistered or wildcard domains.
The next guess would be that during setup, someone somewhere got ahold of your SNI (and/or outgoing DNS requests). Maybe your ISP/VPN service actually logs them and announces it to the world.
I suggest next time, try setting up without any over-the-internet traffic at all, e.g. always use curl with the --resolve flag on the same VM as Apache to check that it's working.
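Something like this (hostname illustrative; `-k` because a local test may not present a matching trusted cert):

```sh
# Pin the name to localhost for this one request: no DNS query is made,
# so nothing about the hostname leaves the box.
curl -k --resolve secret.example.com:443:127.0.0.1 https://secret.example.com/
```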
We're always watching.
You say you have a wildcard cert, but just to make sure: I don't suppose you've used ACME with Let's Encrypt or some other publicly trusted CA to issue a cert including the affected name? If so, it will be public in the Certificate Transparency logs.
If not, I'd do it again and closely log and monitor every packet leaving the box.
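For the monitoring part, even just watching DNS is telling (interface name illustrative):

```sh
# Print every DNS query leaving the box while you re-create the vhost;
# any hit for the secret name against an external resolver is your leak.
tcpdump -i eth0 -nn port 53
```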
The random name is not in the public log. Someone else suggested that earlier. I checked CRT.sh and while my primary domain is there, the random one isn't.
My next suspicion, from what you've shared so far and apart from what others have suggested, would be something outside the HTTP server loop.
Have you used some free public DNS server and inadvertently queried it with the name from a container or something? Developer tooling building some app with analytics not disabled? Any locally connected AI agents with access to it?
Are you sure they're hitting the hostname and not just the IP directly?
It shows up by name in the Apache other_hosts...log, so yes.
You need to make sure to use 444 to drop the connection immediately on a wrong domain. The redirect to HTTPS should only be configured after that; I suspect your config redirects first.
I believe that some DNS servers are configured to allow zone transfers without any kind of authentication. While properly configured servers will whitelist the IPs of secondaries they trust, for those that don't, hackers can simply request a zone transfer and get all subdomains at once.
I don't have any subdomains registered with DNS.
I attempted `dig axfr example.com @ns1.example.com`; it returned "zone transfer DENIED".
Reverse DNS? Or vuln scans just hitting IPs. You don't need DNS for that.
Did you yourself make a request to it, or did you just set it up and not check? My horrifying guess is that if you use SNI in a request, every server in the middle could read the subdomain, and some system in the internet routing path is untrustworthy.