Home Assistant


Home Assistant is open source home automation that puts local control and privacy first. Powered by a worldwide community of tinkerers and DIY...

founded 2 years ago
1
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/homeassistant by /u/Gigant1000 on 2026-03-23 09:52:03+00:00.


I’ve been deep into the Shelly ecosystem for years and built a pretty large Home Assistant setup around it – currently running close to 90 devices across different generations.

At first, I really liked the flexibility and local control. But over time, things started to pile up:

  • repeated device dropouts
  • stability issues
  • firmware/update headaches
  • and a LOT of replacements

The breaking point for me was the Shelly Duo GU10 bulbs. I had 15 of them. Every single one had to be replaced multiple times. In the end, I removed them completely because the failure rate was just too high.

So I thought: okay, I’ve invested heavily, had real issues – maybe support will offer some kind of goodwill or at least acknowledge the scale.

I contacted Shelly support and explained everything.

Their response?

No exceptions. No individual solution. Just the standard program: 50% discount on selected replacements.

That’s it.

So basically:

It doesn’t matter if you have 2 devices or 90 devices.

I’m not even angry at this point – just disappointed.

I can understand that Gen1 products aren’t perfect. But what really matters is how a company treats long-term users when things go wrong.

For me, this was the moment where I decided:

I won’t expand my Shelly setup any further.

From now on, I’ll start replacing devices step by step with alternatives.

Curious:

Has anyone else here had similar experiences with larger Shelly setups?

And what did you switch to?

2
 
 

The original was posted on /r/homeassistant by /u/sfortis on 2026-03-23 07:00:39+00:00.

3
 
 

The original was posted on /r/homeassistant by /u/jfriend99 on 2026-03-23 04:43:08+00:00.


I'm a retired software developer and new to Home Assistant. I want to do some more complicated automations that involve conditional logic from multiple sensors (temperature sensor, vibration/accelerometer sensor, two switches that can be controlled manually or by HA and affect the automation state, time of day, timers, and some WiFi presence information).

In asking Gemini how to do that in Home Assistant YAML, I ended up with YAML with embedded Jinja2 that I consider complicated to read, understand, and maintain (compared to how I would express the logic in a more traditional programming language), and it didn't even have all the logic in it yet. So, after some more research, I find that there are many ways to implement your more complicated automation logic in HA:

  • YAML + Jinja2 - still don't quite understand the rationale behind using Jinja2
  • Pyscript - somewhat limited python code internal to HA, but could probably cover all the logic flow I need
  • Full Python - external process, communicates with HA via API
  • Node-Red - external process, communicates with HA via API (webSocket and REST), has visual flow design + ability to code in Javascript
  • Full C#/.NET environment - external process, communicates with HA via API
  • And there are probably more ...
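To give a concrete sense of the YAML + Jinja2 option above, here is a minimal sketch of a multi-sensor condition (entity IDs are hypothetical, not from the post):

```yaml
# Hypothetical entity IDs - illustrates the YAML + Jinja2 template style
automation:
  - alias: "Multi-sensor guard (sketch)"
    trigger:
      - platform: state
        entity_id: binary_sensor.vibration
        to: "on"
    condition:
      - condition: template
        value_template: >
          {{ states('sensor.temperature') | float(0) < 5
             and is_state('switch.manual_override', 'off')
             and 7 <= now().hour < 22 }}
    action:
      - service: switch.turn_on
        target:
          entity_id: switch.heater
```

The logic lives inside a Jinja2 template string, which is exactly the readability complaint: the templating keeps the config declarative, at the cost of embedding code in strings.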

My developer experience is in C++ and JavaScript, and I have some familiarity with Python from writing small scripts. My leaning right now would be Node-Red, but I don't really know what the tradeoffs are for going with an external environment, or that one in particular.

For those of you with more complicated automations, are you just figuring out how to express them in the UI or in YAML, going to Pyscript for some things or learning one of the external coding environments?

For the external environments, are there any drawbacks to using them? Latency? Memory use? Are any of the choices more popular than the others for any particular reason? If it matters, I'm running the Home Assistant Green hardware.

4
 
 

The original was posted on /r/homeassistant by /u/NekuSoul on 2026-03-22 18:43:57+00:00.

5
 
 

The original was posted on /r/homeassistant by /u/tiochino on 2026-03-23 01:22:32+00:00.


I have a storage room in my house, and the water pump for my well resides in the storage room. I have a problem where I frequently turn the water on outside and forget about it, so I set up a monitor in Home Assistant to alert me when the well pump has been on for more than an hour. I used a microphone in the storage room and infer that the pump is running when the noise level is above a certain threshold.

Last night it alerted at 2am and I went through the house looking for water left on; there was none. I went to the storage room, where my home server also happens to live. The well pump was quiet, but the fan on my server was running full blast. That is what my "pump monitor" detected.

I went back to bed, and in the morning I discovered that all of my server CPUs were maxed out. A little investigation revealed that I stupidly had a Frigate instance running on an unsecured and exposed port. A hacker found the vulnerability, rewrote my Frigate config file, and was running a crypto mining process masquerading as a camera on my system.

So, in essence, my pump monitor in Home Assistant alerted me that a hacker was exploiting my system.

6
 
 

The original was posted on /r/homeassistant by /u/imthenachoman on 2026-03-22 20:34:01+00:00.


I have my Reolink cameras set to record to a microSD card on the camera. If I ever need to see footage, I can get it from there.

I only want Frigate to detect things. I think I have my config.yml right but I'm not sure?

If it matters, my server is a Dell Inspiron 3880 with an i7-10700 CPU and 64 GB of ram. Attached is my Docker Compose file. I do not have any GPU or special detectors for Frigate installed. I want to use my CPU/GPU for Frigate. And everything is running as a Docker container.

mqtt:
  host: mosquitto
  port: 1883
  topic_prefix: [redacted]
  client_id: [redacted]

go2rtc:
  streams:
    e1_zoom_20_sub:
      - rtsp://dingo:bingo@192.168.30.20:554/h264Preview_01_sub
    e1_zoom_21_sub:
      - rtsp://dingo:bingo@192.168.30.21:554/h264Preview_01_sub
    duo_22_sub:
      - rtsp://dingo:bingo@192.168.30.22:554/h264Preview_01_sub
    duo_23_sub:
      - rtsp://dingo:bingo@192.168.30.23:554/h264Preview_01_sub
    duo_24_sub:
      - rtsp://dingo:bingo@192.168.30.24:554/h264Preview_01_sub

detectors:
  cpu1:
    type: cpu

ffmpeg:
  hwaccel_args: preset-vaapi

record:
  enabled: false

snapshots:
  enabled: false

objects:
  track:
    - person
    - car

cameras:
  e1_zoom_20:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/e1_zoom_20_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    detect:
      width: 640
      height: 360
      fps: 5

  e1_zoom_21:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/e1_zoom_21_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    detect:
      width: 640
      height: 360
      fps: 5

  duo_22:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/duo_22_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    detect:
      width: 640
      height: 360
      fps: 5

  duo_23:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/duo_23_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    detect:
      width: 640
      height: 360
      fps: 5

  duo_24:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/duo_24_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    detect:
      width: 640
      height: 360
      fps: 5
7
 
 

The original was posted on /r/homeassistant by /u/Dry-Revenue-3479 on 2026-03-22 16:28:44+00:00.

8
 
 

The original was posted on /r/homeassistant by /u/JSTrucker on 2026-03-22 11:22:14+00:00.


Does anyone upload any sensors or information gathered from their home to data gathering sites….. willingly?

For example, I have a Geiger counter (radiation [few months]) on my roof that I send readings from to 3 different sites, as well as ADS-B (aircraft positions [two years]) to FlightRadar24 and 3 other places.

I’m just wondering if, what, and how long you have been doing it for

9
 
 

The original was posted on /r/homeassistant by /u/bukaro on 2026-03-22 12:46:35+00:00.


I kept forgetting my pills. Not dramatically, just the "it's 11pm and oh shit" kind. So I spent way more time building a fix than the problem deserved.

An ESP32 in a 3D printed case sits in my bathroom, hooked up to Home Assistant. A Zigbee vibration sensor on the pill box detects when I take them. Miss 10pm and the LED goes red, HA fires a notification. There's a physical button on the case too for when the sensor misses it.

https://codeberg.org/Buckaroo/PillsReminder/raw/branch/main/img/PXL_20260322_114526910.MP.jpg

https://codeberg.org/Buckaroo/PillsReminder/raw/branch/main/img/PXL_20260322_114532463.MP.jpg

How it works

The ESP32 runs ESPHome and subscribes directly to HA entities. Primary detection is a Zigbee vibration sensor on the pill box. ESPHome picks it up via platform: homeassistant and sets a taken_today flag in NVS so it survives reboots.

LED colours: blue = waiting, off = taken, red = missed. An HA automation triggers at my phone's next alarm time. If pills aren't taken, it turns the LED red and waits 30 minutes. Still not taken, it sends a Pushbullet notification and a Pixel Watch alert. Flag resets at midnight.

Two physical buttons on the case. The OK button is a fallback for when the vibration sensor misses a light touch — press it and taken_today gets set, LED turns off, same as the sensor. The Reset button undoes a false positive: knocked the pill box, something set it off by accident, press Reset and the flag clears, LED goes back to blue. Both fire events to HA so automations stay in sync.
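The pattern described above — ESPHome mirroring an HA entity and persisting a flag — might look roughly like this in ESPHome YAML; the entity and ID names here are guesses, not the author's:

```yaml
# Sketch only - entity and ID names are hypothetical
globals:
  - id: taken_today
    type: bool
    restore_value: true    # stored in flash (NVS on ESP32), survives reboots
    initial_value: "false"

binary_sensor:
  - platform: homeassistant       # subscribe directly to an HA entity
    id: pillbox_vibration
    entity_id: binary_sensor.pillbox_vibration
    on_press:
      then:
        - globals.set:
            id: taken_today
            value: "true"
```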

Things I picked up building this:

  • ESPHome (love it)
  • Basic soldering (survived)
  • Fusion360 for the enclosure (still scared of it)
  • I will absolutely forget my pills without external intervention

Q: Which sensor? A cheap Tuya Zigbee vibration sensor from AliExpress.

Q: Why the box and buttons if the sensor does the job? Why not. Also buttons.

Code: ESPhome yaml, HA yaml, STLs, wiring, etc: codeberg.org/Buckaroo/PillsReminder

EDIT: PillBox picture https://codeberg.org/Buckaroo/PillsReminder/raw/branch/main/img/pillbox.jpg

10
 
 

The original was posted on /r/homeassistant by /u/tinyhurdles on 2026-03-21 19:22:43+00:00.

11
 
 

The original was posted on /r/homeassistant by /u/Puzzleheaded_Care156 on 2026-03-21 22:47:13+00:00.


Getting my setup figured out, but there are a ton of options for brands. I'm leaning towards Aqara, but it's pricey.

12
 
 

The original was posted on /r/homeassistant by /u/dr_stutters on 2026-03-22 04:53:04+00:00.

13
 
 

The original was posted on /r/homeassistant by /u/Kdcius on 2026-03-22 00:05:09+00:00.

14
 
 

The original was posted on /r/homeassistant by /u/Bran_Solo on 2026-03-21 18:15:19+00:00.


I'm a few years out of the home automation game and just closed on a new house. It looks like since I've been away, everyone has moved from Hubitat to Home Assistant and there's some great dedicated hardware for it now. Awesome!

When I was last a homeowner I had some Eufy outdoor cameras + floodlights that worked reasonably well, except the people detection could not be used as a trigger for events on other devices. My new home has a very dark driveway area and I'd like to be able to implement some logic like "when a human is detected, ensure that several lights are turned on for at least x seconds after human detection ends", and there's a pretty good chance some of the lights will come from a different ecosystem like Lutron or TP-Link.
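That "lights on while a person is detected, plus a hold time" logic maps fairly directly to an HA automation, assuming the camera exposes a person-detection binary_sensor (as Frigate and some ONVIF cameras do); the entity IDs below are hypothetical:

```yaml
# Sketch - entity IDs are hypothetical
automation:
  - alias: "Driveway lights on person (sketch)"
    mode: restart                 # a re-detection restarts the hold timer
    trigger:
      - platform: state
        entity_id: binary_sensor.driveway_person
        to: "on"
    action:
      - service: light.turn_on
        target:
          entity_id: light.driveway
      - wait_for_trigger:
          - platform: state
            entity_id: binary_sensor.driveway_person
            to: "off"
      - delay: "00:01:00"         # keep lights on after detection ends
      - service: light.turn_off
        target:
          entity_id: light.driveway
```

`mode: restart` makes re-detections during the hold window extend the timer, which matches the "at least x seconds after detection ends" requirement.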

I've come up dry with some preliminary research. Is there a go-to camera system recommended for this? I've seen some workarounds to try to get this stuff to work with Eufy but it feels a bit hacky and I'm seeing mixed reports about reliability and latency.

15
 
 

The original was posted on /r/homeassistant by /u/hasntexplodedyet on 2026-03-21 16:30:56+00:00.


Full disclosure up front:

I used Claude Code to write the bulk of this code. I set the architecture, design, and reviewed code that it wrote and conducted significant testing. But this was written by AI tools. I have years of development experience but use AI now to complete projects like this which I simply do not have the time for otherwise.

In order for this tool to function, you will need to provide it Open Router and ElevenLabs API keys, and it needs to generate LLM / gen AI content.

This post was not written by AI. Real human here.

TL;DR: This app enables you to turn your house into an interactive playground for kids, using your smart devices: specifically lights, doors, and objects which children can safely interact with. You can run this in three different modes: on a phone/PC using local speakers; using network-attached audio in Home Assistant (Sonos, Google Home, Alexa); or using Apple TV. The app creates missions/challenges, tells the kids what to do, sets a timer, and automatically polls Home Assistant to determine when they complete the mission. At the end it provides a "game summary".

What's included:

  • Game framework, dockerized
  • Dev environment with a pre-configured dev Home Assistant instance for testing / demo for those who (understandably) prefer to not just YOLO it on their Home Assistant.
  • Source code for Apple TV app - you can build in Xcode and push to your Apple TV (apple limits it to living for 7 days on your TV each push unless you are a paid developer)

What you need to do:

  • Connect to your Home Assistant; I'd suggest making a new user and generating a token for that user.
  • The app will scan your Home Assistant devices and allow you to select which speakers (if any) you want to use for challenges
  • The app will scan your Home Assistant devices and propose challenges (e.g. turn the kitchen lights on; open the back door; turn the fan and light on in the kids' bedroom) which YOU review and approve
  • Generate media - the app does this for you, ~~but you need to provide Open Router and ElevenLabs API keys. It will cost somewhere around $3.50 or so to generate all the content. You can re-generate most/all of it if you wish to. Most of the TTS audio text can be modified.~~ Update: just Gemini API keys directly, no more ElevenLabs or Open Router. You'll have better luck being on Tier 2 billing tier in Gemini, but Tier 1 will be ok as well with selective back-offs on rate limiting.

This is just a personal project but I figured it is a fun way to get kids moving and active inside the house, so I put a little extra polish on it and share it because I haven't found many other projects like this.

I of course realize that this is pretty niche and most people won't have a need for this or even want to use it. And that's ok!

Screenshots and more information in the Readme on the GitHub: https://github.com/dcnoren/mission-control

16
 
 

The original was posted on /r/homeassistant by /u/nyc2pit on 2026-03-21 19:38:57+00:00.

17
 
 

The original was posted on /r/homeassistant by /u/seriousthrillissues on 2026-03-21 17:53:54+00:00.

18
 
 

The original was posted on /r/homeassistant by /u/rapax on 2026-03-21 14:19:28+00:00.


We have a nice little pond in our backyard, with newts and toads and all kinds of local wildlife. In summer, it's the jewel of our back yard. Unfortunately, evaporation often leads to low water levels, so we have to occasionally top it up.

I have a nice setup with a zigbee water valve that I can open and close through HA, no problem. What I don't have is a reliable water level sensor. That is obviously something that needs fixing, so I got to thinking about ways of measuring the water level (boundary condition: the sensor must not be visible to comply with WAF criteria).

  • Floaters don't work, too many plants and algae etc.
  • Ultrasonic needs a minimum distance of 10-20cm, and while there is a little wooden bridge over the pond, it's only about 10cm from the water, so not enough room to mount a sensor

So I came up with the idea of hiding a little wifi camera under the bridge and mounting a broken piece of a yard stick in front of it. Then use AI vision to read the water level. Worked beautifully at first: a little automation to wake up the camera every three hours, turn on the LED floodlight, snap a picture, pass it to Gemini, get back the water level, turn on the water if it's too low. Beautiful.
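The flow described above could be sketched as an HA automation along these lines; the entity IDs and the AI-vision step are assumptions, not the author's actual config:

```yaml
# Sketch - entity IDs are hypothetical; the Gemini vision step is elided
automation:
  - alias: "Pond level check (sketch)"
    trigger:
      - platform: time_pattern
        hours: "/3"               # every three hours
    action:
      - service: light.turn_on
        target:
          entity_id: light.pond_floodlight
      - service: camera.snapshot
        target:
          entity_id: camera.pond_bridge
        data:
          filename: /config/www/pond_level.jpg
      # assume an AI-vision integration reads the yardstick from the snapshot
      # and writes the result to sensor.pond_water_level
      - condition: numeric_state
        entity_id: sensor.pond_water_level
        below: 20
      - service: switch.turn_on
        target:
          entity_id: switch.pond_water_valve
```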

That is, until our helpful little spiders decided that what that setup really needed was a bunch of cobwebs right in front of the lens, so that they could reflect the light everywhere and make reading the water level impossible.

Ok, it's their home, I'll find another way.

Let's revisit that idea with the ultrasound sensor. Yes, there's not enough room to mount it vertically under the bridge, but what about horizontally? I added a 45° reflector panel under the middle of the bridge, and placed a Shelly BLU distance sensor on a flat rock under the end of the bridge. I intentionally didn't mount the sensor to the bridge, because I wanted to be able to just reach under there and grab it whenever the battery needs charging. Brilliant. Works perfectly well...for a few days.

It's now early spring, and the frogs are having a party. There's more frogspawn in the pond every morning, and at night they're loudly enjoying themselves. Cool. Suddenly, the vibration sensor of the Shelly triggers, then again, and the distance becomes implausible. Then the signal is lost.

Turns out the Shelly is IP64 rated, but that doesn't mean it likes being kicked into the pond by horny frogs. So now the sensor is dead. 2:0 for the critters.

I have now ordered one of these zigbee sensors. Apparently they have a shorter minimum distance and are mains powered, so no battery needed. I wonder which little beastie will foil my plans this time.

19
 
 

The original was posted on /r/homeassistant by /u/hometechgeek on 2026-03-21 10:48:42+00:00.

20
 
 

The original was posted on /r/homeassistant by /u/ScientistEasy1328 on 2026-03-21 08:19:28+00:00.


Running HA on a Raspberry Pi, CasaOS + Docker. Started looking at connecting Claude to the whole thing: there's the official MCP integration (added in 2025.2) and the community ha-mcp project. Haven't set up either yet.

Question for people who've actually done this: does it change how you use HA day-to-day, or is it mostly a demo? And practically: official integration or ha-mcp? I'm self-hosted without Nabu Casa, so remote access runs through my own tunnel. Curious if that breaks anything or if

21
 
 

The original was posted on /r/homeassistant by /u/trevah1200 on 2026-03-21 01:07:56+00:00.

22
 
 

The original was posted on /r/homeassistant by /u/mattchew0 on 2026-03-21 00:40:40+00:00.


Is there an integration for Home Assistant you wish existed but haven’t been able to find? Maybe it’s a self hosted service or a product you wish had an integration.

Looking for my next project.

23
 
 

The original was posted on /r/homeassistant by /u/i_bri on 2026-03-20 22:47:09+00:00.

24
 
 

The original was posted on /r/homeassistant by /u/Forward-Arm3051 on 2026-03-20 20:51:01+00:00.

Original Title: Entity Manager update v2.17.0 A powerful, feature-rich Home Assistant integration for managing entities across all your integrations. View, enable, disable, rename, analyze, and bulk-manage entities and firmware updates from a single modern interface.

25
 
 

The original was posted on /r/homeassistant by /u/hometechgeek on 2026-03-20 19:54:56+00:00.
