Optimizing Camera Resolution Based on Network Connection

Hey everyone, I’ve been working on a project to optimize my camera setup based on whether I’m connected to my home network or accessing it remotely. I wanted to share my experience and see if anyone has similar setups or tips to make this even better!

So, here’s the deal: I use DuckDNS to access my Home Assistant setup both at home and on the go. When I’m inside my LAN, my cameras stream in high resolution with smooth frame rates. But when I’m outside, I’d like to switch to a lower resolution to save bandwidth. It’s all about balancing quality and performance!

I thought about using a switch entity (or an input_boolean helper) to toggle between the high- and low-resolution streams. The idea is to dynamically adjust the RTSP path based on where I'm connecting from. But I'm wondering if there's a smarter way to automate this without manual intervention. Maybe using geolocation, or detecting automatically whether my phone is on the home network?
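For what it's worth, here's a rough sketch of how I'm picturing the toggle approach, assuming the generic camera integration accepts a templated `stream_source` and that your camera exposes two RTSP substreams. All the entity IDs, the camera IP, and the stream paths below are placeholders, so adjust them to your own setup:

```yaml
# Sketch only: a generic camera whose RTSP source follows a helper toggle.
# input_boolean.high_res_stream, the IP, and the /stream paths are hypothetical.
camera:
  - platform: generic
    name: front_door
    stream_source: >-
      {% if is_state('input_boolean.high_res_stream', 'on') %}
        rtsp://192.168.1.50:554/stream1
      {% else %}
        rtsp://192.168.1.50:554/stream2
      {% endif %}

# One possible way to flip it automatically: track a phone and drop to
# low res whenever it leaves home (device_tracker.my_phone is a placeholder).
automation:
  - alias: "Camera low res when away"
    trigger:
      - platform: state
        entity_id: device_tracker.my_phone
        to: "not_home"
    action:
      - service: input_boolean.turn_off
        target:
          entity_id: input_boolean.high_res_stream
  - alias: "Camera high res when home"
    trigger:
      - platform: state
        entity_id: device_tracker.my_phone
        to: "home"
    action:
      - service: input_boolean.turn_on
        target:
          entity_id: input_boolean.high_res_stream
```

One caveat with this sketch: the template is evaluated when the stream is set up, so you may need to restart the stream after toggling for the new path to take effect. Curious whether anyone has a cleaner way around that.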

If anyone has tackled something similar or has a clever workaround, I’d love to hear about it! Whether it’s through automations, custom scripts, or even built-in features I might have missed, your insights would be super helpful.

Let’s make our camera setups as smart as they look! :rocket: