The original was posted on /r/homeassistant by /u/anvarazizov on 2026-02-18 21:24:27+00:00.
Hey r/homeassistant,
I live in Ukraine. russia regularly attacks our power grid — when it goes down, internet and cell towers follow within hours. My Home Assistant keeps running on battery backup, but I can't reach it from outside. So I built a radio bridge.
How it works
Two Lilygo T-Echo radios (~$30 each, LoRa 433MHz, Meshtastic firmware). One is plugged into my Mac mini via USB; the other is the portable unit I carry with me. A Python listener daemon sits between the radio and Home Assistant, routing commands and returning sensor data, all over encrypted LoRa. HA runs on a Home Assistant Green.
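For anyone curious, the core of the listener is small. Here is a trimmed-down sketch (not my exact code) built on the Meshtastic Python API; the serial device path and the `handle_command()` router are placeholders:

```python
# Minimal listener sketch: receive LoRa text, route it, reply over LoRa.
# Assumes the meshtastic + pypubsub packages; device path is a placeholder.
import time
import meshtastic.serial_interface
from pubsub import pub

def handle_command(text: str) -> str:
    # Placeholder router: classify the message and call the right HA endpoint.
    return f"got: {text}"

def on_receive(packet, interface):
    # Fires for every received text message; the text sits in decoded.text.
    text = packet.get("decoded", {}).get("text")
    if text:
        interface.sendText(handle_command(text))  # answer back over LoRa

pub.subscribe(on_receive, "meshtastic.receive.text")
iface = meshtastic.serial_interface.SerialInterface(devPath="/dev/tty.usbmodem1101")

while True:  # the pubsub callback does the work; just keep the process alive
    time.sleep(1)
```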
What I can do from the radio
Smart home control:
- Turn lights on/off
- Check temperature from Aqara sensors (I have 3 around the house)
- Check power status — grid on/off, battery levels (EcoFlow, Zendure)
- Check who's home
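All of those commands boil down to two Home Assistant REST calls: read a state or call a service. A rough sketch with a placeholder URL, token and entity IDs (not my real ones):

```python
# Sketch: read a sensor and toggle a light through the HA REST API.
import requests

HA_URL = "http://homeassistant.local:8123"
HEADERS = {"Authorization": "Bearer <long-lived-access-token>"}

def get_state(entity_id: str) -> str:
    r = requests.get(f"{HA_URL}/api/states/{entity_id}", headers=HEADERS, timeout=10)
    r.raise_for_status()
    data = r.json()
    unit = data["attributes"].get("unit_of_measurement", "")
    return f"{data['state']} {unit}".strip()

def call_service(domain: str, service: str, entity_id: str) -> None:
    requests.post(
        f"{HA_URL}/api/services/{domain}/{service}",
        headers=HEADERS, json={"entity_id": entity_id}, timeout=10,
    ).raise_for_status()

# "temp?" over the radio becomes roughly:
print(get_state("sensor.aqara_living_room_temperature"))
# "lights off" becomes roughly:
call_service("light", "turn_off", "light.living_room")
```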
Voice messages (this is the fun part):
- Type `SAY: Привіт, я скоро буду вдома` ("Hey, I'll be home soon") on the T-Echo
- Listener calls `tts.google_translate` with Ukrainian as the language
- HA Voice PE speaker reads it aloud at home
- Zero internet. Just radio → Mac mini → HA TTS → speaker
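The TTS leg is a single `tts.speak` service call. A hedged sketch below; the speaker entity is a placeholder matching the `media_player.home_assistant_voice_*` pattern listed further down:

```python
# Sketch: forward a "SAY: ..." radio message to the Voice PE speaker via HA.
import requests

HA_URL = "http://homeassistant.local:8123"
HEADERS = {"Authorization": "Bearer <long-lived-access-token>"}

def say_at_home(message: str) -> None:
    # tts.speak needs the TTS entity, the target media player and the text.
    payload = {
        "entity_id": "tts.google_translate_en_com",
        "media_player_entity_id": "media_player.home_assistant_voice_living_room",
        "message": message,
        "language": "uk",
    }
    requests.post(
        f"{HA_URL}/api/services/tts/speak",
        headers=HEADERS, json=payload, timeout=10,
    ).raise_for_status()

text = "SAY: Привіт, я скоро буду вдома"
if text.startswith("SAY:"):
    say_at_home(text[len("SAY:"):].strip())
```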
Camera snapshots:
- Ask "what's outside?" via radio or Discord
- Listener grabs snapshots from Tapo C120 + C100 (via HA camera proxy API)
- Runs them through a local vision model (gemma3:12b on Ollama)
- Sends me a text description: "5 cars parked, no people, snowy"
- Hourly automated monitoring logs everything
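The snapshot-to-description path looks roughly like the sketch below. It assumes Ollama's `/api/generate` endpoint with a base64-encoded image; the URLs, token and prompt are placeholders:

```python
# Sketch: grab a frame from HA's camera proxy, describe it with a local vision model.
import base64
import requests

HA_URL = "http://homeassistant.local:8123"
HEADERS = {"Authorization": "Bearer <long-lived-access-token>"}
OLLAMA_URL = "http://localhost:11434"

def describe_camera(camera_entity: str) -> str:
    # 1. Raw JPEG bytes from the camera proxy endpoint.
    snap = requests.get(
        f"{HA_URL}/api/camera_proxy/{camera_entity}", headers=HEADERS, timeout=20
    )
    snap.raise_for_status()
    # 2. Hand it to the vision model; Ollama accepts images as base64 strings.
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": "gemma3:12b",
            "prompt": "Describe this camera snapshot in one short sentence.",
            "images": [base64.b64encode(snap.content).decode()],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"].strip()

print(describe_camera("camera.tapo_c120_hd_stream"))
```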
Proactive alerts:
- The AI monitors power status
- Power goes out → LoRa message to my radio within seconds
- Also sends battery levels and temperature
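The alert side is just a poll plus a LoRa transmit. A simplified sketch, where `iface` is the Meshtastic serial interface from the listener sketch above and the grid sensor entity is a placeholder:

```python
# Sketch: watch the grid sensor and push a LoRa alert when it changes state.
import time
import requests

HA_URL = "http://homeassistant.local:8123"
HEADERS = {"Authorization": "Bearer <long-lived-access-token>"}
GRID_SENSOR = "binary_sensor.power_grid"  # placeholder entity

def grid_is_up() -> bool:
    r = requests.get(f"{HA_URL}/api/states/{GRID_SENSOR}", headers=HEADERS, timeout=10)
    r.raise_for_status()
    return r.json()["state"] == "on"

def watch_grid(iface, interval: int = 15) -> None:
    # iface is the meshtastic SerialInterface created by the listener daemon.
    last = grid_is_up()
    while True:
        time.sleep(interval)
        now = grid_is_up()
        if now != last:
            iface.sendText("Grid is back." if now else "Power outage at home!")
            last = now
```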
The HA integration
The listener talks to HA through the REST API:
- `GET /api/states/{entity_id}` — read sensors
- `POST /api/services/{domain}/{service}` — control devices
- `GET /api/camera_proxy/{camera_entity}` — grab snapshots
- `POST /api/services/tts/speak` — voice messages
Incoming radio messages get classified by a local LLM (phi4-mini) — "is this a smart home command, a question, or a TTS request?" Then routed to the right HA service or to a larger model (gemma3:12b) for general questions.
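The classification step can be a single prompt against Ollama. A sketch (the model tag, label set and prompt wording are assumptions, not my exact setup):

```python
# Sketch: route an incoming radio message with a small local model.
import requests

OLLAMA_URL = "http://localhost:11434"

def classify(message: str) -> str:
    prompt = (
        "Classify the following message with exactly one word: "
        "COMMAND (smart home action), QUESTION (general question), or "
        "TTS (starts with SAY: and should be spoken aloud at home).\n\n"
        f"Message: {message}\nLabel:"
    )
    r = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": "phi4-mini", "prompt": prompt, "stream": False},
        timeout=60,
    )
    r.raise_for_status()
    label = r.json()["response"].strip().upper()
    return label if label in {"COMMAND", "QUESTION", "TTS"} else "QUESTION"

# COMMAND → HA service call, TTS → tts.speak, QUESTION → gemma3:12b.
print(classify("turn off the lights"))
```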
Architecture
T-Echo (portable)
│ LoRa 433MHz, encrypted
▼
T-Echo (USB) → Mac mini
│
├── SAY: prefix → tts.google_translate → Voice PE speaker
├── Smart home → Home Assistant REST API
├── Camera → camera_proxy → gemma3 vision → description
├── AI questions → phi4-mini → gemma3:12b (local via Ollama)
└── Alerts → outbox .msg files → LoRa TX
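The outbox in that last branch is just a folder of small text files that the daemon drains and transmits. A simplified sketch, with a placeholder path:

```python
# Sketch: drain an outbox directory of .msg files and transmit them over LoRa.
import time
from pathlib import Path

OUTBOX = Path("~/ha-lora/outbox").expanduser()  # placeholder path

def drain_outbox(iface) -> None:
    # iface is the meshtastic SerialInterface created by the listener daemon.
    for msg_file in sorted(OUTBOX.glob("*.msg")):
        text = msg_file.read_text(encoding="utf-8").strip()
        if text:
            iface.sendText(text)   # one LoRa packet per file
        msg_file.unlink()          # remove once sent

def run(iface, interval: int = 5) -> None:
    OUTBOX.mkdir(parents=True, exist_ok=True)
    while True:
        drain_outbox(iface)
        time.sleep(interval)
```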
Why this matters
HA on battery backup is great, but useless if you can't reach it. The radio bridge means:
- No dependency on WiFi, internet, or cell towers
- Encrypted communication (Meshtastic PSK)
- ~1-3 km urban range with stock T-Echo antenna (extendable with mesh nodes)
- Total cost: ~$60 for two radios
Entities I use
- `camera.tapo_c120_hd_stream` / `camera.tapo_c100_hd_stream` — snapshots
- `tts.google_translate_en_com` (with `language: "uk"`) — Ukrainian TTS
- `media_player.home_assistant_voice_*` — the speaker
- `binary_sensor.tapo_c120_person_detection` — triggers
- Aqara temperature sensors
- Power grid status sensor (via the Yasno integration and a Meross Smart Plug used as a sensor)
- EcoFlow battery levels
Stack
- Home Assistant — the heart of it all
- HA Voice PE — TTS output speaker
- Tapo C120 + C100 — security cameras
- Meshtastic on Lilygo T-Echo (433MHz)
- Ollama — local AI models
- OpenClaw — AI agent framework
- Mac mini M4 — server on battery backup
Happy to answer questions about the HA setup.