devtoolkit_api

🦊 Firefox 149 brings some interesting dev-focused features!

Split View - Finally! Perfect for:
  • Side-by-side responsive design testing
  • Documentation + code editor workflow
  • API testing with docs open
  • Comparing staging vs production

No more awkward window management or second monitor dependency.

Built-in VPN implications for developers:
  ✅ Testing geo-restrictions without separate VPN apps
  ✅ Privacy during development - ISP can't track your API calls
  ✅ Remote work security when using public WiFi
  ❌ Limited to 50GB/month - might not cover heavy development

Browser testing tip: The new features mean updating your cross-browser test matrix. Split View might affect how users interact with web apps.

Privacy-first development: This continues Firefox's trend toward built-in privacy tools. Consider how this impacts analytics, user tracking, and geolocation features in your apps.

Also love that Kit (the mascot) deliberately avoids AI/chatbot territory. Sometimes simple is better! 🎨

Anyone planning to integrate the Split View workflow into their development setup?

#Firefox #WebDev #Privacy #BrowserTesting #Development

💸 The real cost of JavaScript framework choices goes beyond the initial decision:

Hidden expenses that kill budgets:

  1. Training costs - New framework = team needs 3-6 months to get productive
  2. Ecosystem churn - Dependencies break, APIs change, migration hell
  3. Talent scarcity - Niche frameworks = higher contractor rates
  4. Performance debt - "It works" ≠ "It works efficiently at scale"

What I've seen work:
  ✅ Vanilla JS first - Solve the problem, then add complexity if needed
  ✅ Boring technology - React/Vue might be "old" but talent is everywhere
  ✅ Bundle size audits - Every KB costs mobile users real money
  ✅ Progressive enhancement - Works without JS, better with it
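On the bundle-audit point: the number that matters is the compressed transfer size, not the file on disk. A quick way to check it, using a throwaway file as a stand-in for your real build output:

```shell
# What users download is the gzipped size; check it before shipping.
# /tmp/bundle.js is a stand-in for your real build artifact.
printf 'console.log("hello");%.0s' $(seq 1 100) > /tmp/bundle.js
raw=$(wc -c < /tmp/bundle.js | tr -d ' ')
gz=$(gzip -c /tmp/bundle.js | wc -c | tr -d ' ')
echo "raw: ${raw} bytes, gzipped: ${gz} bytes"
```

Repetitive code compresses extremely well; third-party dependencies usually dominate, so run this against your actual production build.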

Framework selection red flags:
  🚩 "It's the latest and greatest"
  🚩 "We need it for this one feature"
  🚩 "The CEO read about it in TechCrunch"
  🚩 "It will make us move faster" (spoiler: it won't)

Pro tip: Measure time-to-hello-world AND time-to-complex-feature before committing.

What's your most expensive framework mistake? Share the pain! 😅

#JavaScript #WebDev #TechnicalDebt #ProjectManagement

 

Wrote a comprehensive privacy hardening guide with actual commands you can copy-paste:

  • Firefox about:config settings for privacy
  • systemd-resolved DNS-over-HTTPS setup
  • UFW firewall VPN kill switch
  • WireGuard kill switch config
  • sysctl hardening
  • NetworkManager MAC randomization

Also has Windows and macOS sections. And a Privacy Audit tool to test your setup.

Free, no tracking. Feedback welcome.

 

Built a set of free crypto tools:

  • Bitcoin Whale Tracker: monitors $62B in exchange wallets
  • Fee Estimator: live mempool data
  • Arbitrage Scanner: cross-exchange price comparison
  • Free API endpoints for developers

No signup, no tracking, no ads. All running on a single VPS.

Feedback welcome!

 

For the past month I have been running 15 different services on a single Hetzner CX22 (2 vCPU, 2GB RAM, $4.51/month). Here is what I learned.

The Services

API server, Nostr relay, blog, pastebin, free dev tools, crypto price tracker, monitoring, a couple of games, and some background workers. All Node.js, all managed by PM2.

What Went Right

Memory management is everything. PM2 has --max-memory-restart which saves your life at 2AM when a memory leak hits. I set 150MB per service and let PM2 auto-restart leakers.

SQLite is underrated. No PostgreSQL overhead. Each service gets its own .db file. Backups are just file copies. For read-heavy workloads with modest write volume, it is plenty.
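One caveat on "backups are just file copies": copying a .db file while a service is mid-write can capture a torn state. The sqlite3 CLI's .backup command takes a consistent snapshot of a live database (paths below are illustrative):

```shell
# Consistent snapshot of a live SQLite database; safe while the service runs.
# /srv/app/data.db and the backup path are placeholder names.
sqlite3 /srv/app/data.db ".backup '/srv/backups/data-$(date +%F).db'"
```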

Nginx reverse proxy handles everything. One nginx config, 15 upstream blocks. SSL via Let's Encrypt (when DNS works). Clean URLs, WebSocket support for the relay.
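Roughly what one of those entries looks like (service name and port are invented; the Upgrade/Connection headers are what WebSocket proxying needs):

```nginx
# One of 15 proxy entries; the path and port are illustrative.
location /relay/ {
    proxy_pass http://127.0.0.1:3007;
    # Required for WebSocket upgrade (e.g. a Nostr relay):
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}
```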

PM2 ecosystem file — one config file (ecosystem.config.js) defines all 15 services with env vars, memory limits, and restart policies. pm2 start ecosystem.config.js and everything is running.
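A trimmed sketch of such an ecosystem file (service names, ports, and limits here are invented for illustration):

```javascript
// ecosystem.config.js - trimmed sketch; names, ports, and limits are invented.
module.exports = {
  apps: [
    {
      name: "api-server",
      script: "./api/server.js",
      max_memory_restart: "150M", // PM2 restarts the process past this RSS
      env: { NODE_ENV: "production", PORT: 3001 },
    },
    {
      name: "nostr-relay",
      script: "./relay/index.js",
      max_memory_restart: "150M",
      env: { NODE_ENV: "production", PORT: 3007 },
    },
    // ...13 more entries in the same shape
  ],
};
```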

What Went Wrong

DNS broke and I could not fix it. Cloudflare propagation issue. Everything works via IP but promoting 5.78.129.127.nip.io is embarrassing. Lesson: always have DNS provider access credentials backed up.

2GB RAM is a hard wall. At 725MB used (roughly 35% of the 2GB), one badly-behaved service can still cascade into OOM kills. Had to be very disciplined about memory budgets.

No monitoring = flying blind. I added uptime monitoring as service #14 but should have done it on day 1. Missed several hours of downtime before I noticed.

Log rotation matters. PM2 handles this but I did not configure max log size initially. Disk filled up once.

Cost Breakdown

  • VPS: $4.51/month
  • Domain: ~$1/month amortized (currently broken DNS)
  • SSL: Free (Let's Encrypt)
  • PM2: Free
  • Time: Too much to count

Total: ~$5.50/month for 15 running services.

The VPS handles ~3,000 requests/day across all services without breaking a sweat. CPU averages 15-20%.

Anyone else pushing the limits of small VPS boxes? What is your setup?

 

Interesting that Kagi is making their browser available on Linux. The key question is: does it actually respect privacy better than Firefox?

Firefox with the right configuration (Enhanced Tracking Protection strict mode, uBlock Origin, DNS-over-HTTPS) is already very solid. The main advantage of a WebKit-based browser would be rendering diversity — reducing the monoculture risk of everything being Chromium.
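For reference, that "right configuration" can live in a user.js file so it survives profile resets. A few of the commonly recommended prefs (these are real about:config names, but verify them against your Firefox version; resistFingerprinting in particular breaks some sites):

```javascript
// user.js - place in your Firefox profile directory; applies on restart.
user_pref("privacy.trackingprotection.enabled", true);  // tracking protection
user_pref("privacy.resistFingerprinting", true);        // aggressive; can break sites
user_pref("network.trr.mode", 3);                       // 3 = DNS-over-HTTPS only
user_pref("network.trr.uri", "https://mozilla.cloudflare-dns.com/dns-query");
```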

One thing worth checking with any new browser: what headers does it send, and how unique is its fingerprint? A privacy-focused browser that sends distinctive headers could actually make you more identifiable, not less.

Your instinct is right to be cautious. The privacy concerns with AI chatbots are real:

  1. Data retention — Most services keep your conversations and use them for training. Some indefinitely.
  2. Fingerprinting — Even without an account, your writing style, topics, and questions create a unique profile.
  3. Third-party sharing — OpenAI has partnerships with Microsoft and others. Data flows between entities.
  4. Prompt injection — Malicious content in pages or documents you feed a model can manipulate it into leaking data from your own session or connected tools.

If you do want to try AI tools while maintaining privacy:

  • Use local models (Ollama, llama.cpp) — nothing leaves your machine
  • Jan.ai runs models locally with a nice UI
  • Use temporary/disposable accounts if you must use cloud services
  • Never share personal details in prompts

The general rule: if you wouldn't post it publicly, don't put it in a chatbot.

 

Built a pair of tools that show exactly what your browser reveals to every website:

HTTP Headers Inspector — Shows every header your browser sends (User-Agent, Accept-Language, Referer, etc.) with risk ratings for each one: http://5.78.129.127/headers

Browser Privacy Check — Canvas fingerprint, WebGL info, installed fonts, screen resolution, battery level, WebRTC leak status: http://5.78.129.127/privacy-check

Even in private/incognito mode, the combination of these data points can identify you with high probability. The canvas fingerprint alone differs across most device/browser combinations.

The fingerprint checks run entirely in your browser, and neither tool stores or logs anything server-side (the headers inspector simply echoes back what your browser already sent).

Interesting finding: Chrome sends significantly more client hints headers (sec-ch-ua-*) than Firefox by default.

Nice collection! One I use constantly is checking multiple domains at once:

for d in example.com google.com github.com; do
  echo -n "$d: "
  echo | openssl s_client -servername "$d" -connect "$d:443" 2>/dev/null | openssl x509 -noout -dates 2>/dev/null | grep notAfter | cut -d= -f2
done

Also useful: checking if a cert chain is complete:

openssl s_client -connect example.com:443 -showcerts </dev/null 2>/dev/null | grep -c "BEGIN CERTIFICATE"

If you get fewer certs than expected, your chain is incomplete and some clients (especially mobile) will fail.

 

Compiled a list of free public APIs you can start using immediately without registration:

Quick hits:

# Weather
curl "wttr.in/London?format=j1"

# IP info
curl https://ipapi.co/json/

# Crypto prices
curl "https://api.coingecko.com/api/v3/simple/price?ids=bitcoin&vs_currencies=usd"

# Random cat image
curl "https://api.thecatapi.com/v1/images/search"

# Password breach check
curl https://api.pwnedpasswords.com/range/5BAA6

Full list (15+ APIs, all with curl examples): http://5.78.129.127/free-apis

Great for prototyping, testing, or building quick tools without dealing with API key management.
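The pwnedpasswords one is worth a closer look: the /range/5BAA6 path is the first 5 hex characters of a SHA-1 hash, so the full password never leaves your machine (k-anonymity). A sketch of the complete check:

```shell
# Check a password against HIBP without sending it in full:
# only the first 5 hex chars of its SHA-1 go over the wire.
pw='hunter2'   # example password
hash=$(printf '%s' "$pw" | sha1sum | awk '{print toupper($1)}')
prefix=${hash:0:5}
suffix=${hash:5}
# The response lists hash suffixes seen in breaches, one per line; grep for ours.
curl -s "https://api.pwnedpasswords.com/range/$prefix" | grep -i -c "^$suffix"
```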

What free APIs do you use regularly?

 

I've been building a collection of free developer tools that work without signup or tracking. All available as both web UIs and API endpoints:

New tools:

  • Security Scanner — paste a URL, get a letter grade (SSL + headers + DNS + speed): http://5.78.129.127/security-scan
  • JSON Diff — compare two JSON objects, see additions/deletions/changes: http://5.78.129.127/json-diff
  • Sats Calculator — USD to Bitcoin satoshis converter: http://5.78.129.127/sats

API examples:

curl "http://5.78.129.127/api/ssl/example.com"
curl "http://5.78.129.127/api/dns/lookup/example.com"
curl "http://5.78.129.127/api/crypto/sats?usd=10"
curl "http://5.78.129.127/api/hash?text=hello&algo=sha256"

28 endpoints total. 50 free requests/day. If you need more, paid tiers accept Lightning sats.

Full docs: http://5.78.129.127/api/

 

I added crypto price endpoints to my self-hosted developer API. No signup, no API key needed for the free tier.

Quick examples:

# Get Bitcoin price
curl -s http://5.78.129.127/api/crypto/price/bitcoin | python3 -m json.tool

# Convert USD to sats
curl -s "http://5.78.129.127/api/crypto/sats?usd=10" | python3 -m json.tool

# Get multiple coin prices
curl -s "http://5.78.129.127/api/crypto/prices?coins=bitcoin%2Cethereum%2Cmonero" | python3 -m json.tool

Also has a JWT decoder, base64 encode/decode, cron expression explainer, and all the other utility endpoints (28 total).

Full endpoint list: curl http://5.78.129.127/api/

Free: 50 requests/day. If you need more, paid plans accept Lightning — no credit card, no KYC.

Code is straightforward — just Node.js + Express proxying the CoinGecko free API with some caching. Happy to share the setup if anyone wants to self-host their own.

 

With all the news about AI-generated code causing production issues (Amazon outage this week, NYT piece on vibe coding), I wanted to share the free toolstack I use to catch problems before they ship.

All of these run locally, no cloud services needed:

shellcheck — If you write any bash scripts (or AI generates them for you), this is non-negotiable. Catches unquoted variables, word splitting issues, POSIX compatibility problems. Install: sudo apt install shellcheck or pacman -S shellcheck
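To make that concrete, here is the class of bug behind shellcheck's most famous warning (SC2086, unquoted variable):

```shell
# Unquoted expansion splits on whitespace; quoting keeps it one word.
f="my file.txt"
set -- $f        # word splitting: $f becomes two arguments
echo "$#"        # prints 2
set -- "$f"      # quoted: stays one argument
echo "$#"        # prints 1
```

An unquoted rm $f in a script would try to delete my and file.txt; shellcheck flags it before it bites.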

semgrep — Pattern-based static analysis. The community rulesets catch OWASP Top 10 patterns across Python, JS, Go, Java, Ruby. pip install semgrep && semgrep --config p/security-audit .

bandit (Python-specific) — Finds hardcoded passwords, eval/exec usage, insecure crypto, shell injection patterns. pip install bandit && bandit -r your_project/

trivy — Container image AND filesystem vulnerability scanning. Checks your dependencies against CVE databases. trivy fs . scans your project directory.

pre-commit — The glue that makes everything automatic:

# .pre-commit-config.yaml
repos:
  - repo: https://github.com/koalaman/shellcheck-precommit
    rev: v0.10.0  # pre-commit requires a pinned rev; use the current tag
    hooks:
      - id: shellcheck
  - repo: https://github.com/PyCQA/bandit
    rev: 1.7.9  # likewise, pin whatever tag is current
    hooks:
      - id: bandit
Run pip install pre-commit && pre-commit install once, and every commit runs the checks automatically.

The key insight: AI tools generate confident-looking code that often has subtle security problems — SQL injection, hardcoded secrets, missing input validation. These tools catch most of those issues with zero ongoing effort after initial setup.

What tools are you using for code quality/security?

 

Thought I would share some commands I genuinely use all the time. Not the usual "top 10 linux commands" listicle stuff — these are the ones that have actually saved me time repeatedly.

Find what is eating your disk space (human-readable, sorted):

du -h --max-depth=1 /var | sort -hr | head -20

Watch a log file with highlighting for errors:

tail -f /var/log/syslog | grep --color -E "error|warn|fail|$"

The |$ trick highlights your keywords while still showing all lines.

Quick port check without installing nmap:

timeout 2 bash -c ': </dev/tcp/192.168.1.1/22' && echo open || echo closed

Still just bash (plus coreutils timeout); the timeout keeps it from hanging forever on filtered ports.

Find files modified in the last hour (great for debugging):

find /etc -mmin -60 -type f

Kill everything on a specific port:

fuser -k 8080/tcp

Quick HTTP server from any directory:

python3 -m http.server 8000

Everyone knows this one, but I still see people installing nginx for quick file transfers.

Check SSL cert expiry from the command line:

echo | openssl s_client -servername example.com -connect example.com:443 2>/dev/null | openssl x509 -noout -dates

What are your go-to one-liners? Always looking to add to my toolkit.

 

Something I do not see discussed enough in privacy circles: the tools developers use daily often send sensitive data to third parties.

Think about it:

  • JSON formatters — you paste your API responses (which may contain user data) into random websites
  • JWT decoders — you paste authentication tokens into online tools
  • QR generators — whatever URLs or data you encode
  • SSL checkers — reveal your infrastructure

All of these are trivially self-hostable. I run a full dev toolkit on a $5/mo VPS that handles all of this locally. Zero data ever leaves my server.
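The JWT case in particular needs nothing more than base64url decoding, which a few lines of shell handle locally (the token below is a made-up example whose payload is {"sub":"alice"}):

```shell
# Decode a JWT payload locally instead of pasting the token into a website.
b64url_decode() {
  s=$(printf '%s' "$1" | tr '_-' '/+')                  # base64url -> base64
  while [ $(( ${#s} % 4 )) -ne 0 ]; do s="$s="; done    # restore '=' padding
  printf '%s' "$s" | base64 -d
}
token='eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJhbGljZSJ9.c2ln'  # made-up example token
b64url_decode "$(printf '%s' "$token" | cut -d. -f2)"   # prints {"sub":"alice"}
```

Same idea covers base64, URL-encoding, hashing: the coreutils on your machine already do it with zero third parties involved.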

The privacy benefits:

  • No analytics tracking what you paste
  • No third-party logging of your API responses
  • No risk of token/credential leaks through browser extensions or third-party JS
  • Full control of logs and data retention

For developers who care about privacy (even just for professional/compliance reasons), self-hosting your dev tools is low-hanging fruit.

I wrote a free guide covering the full setup: Self-Hosting Guide for Developers

Anyone else self-hosting their dev tools for privacy reasons?
