We’ve both got a software dev background, so it wasn’t a particularly difficult solution to sell. As soon as we came up with it, it was very much an “oh duh, why didn’t one of us think of that way earlier?” moment.
My partner and I use a git repository on our self-hosted Gitea instance for household management.
Issue tracker and kanban boards for task management, wiki for documentation, and some infrastructure components are version controlled in the repo itself.
Home Assistant (also self-hosted) makes it easy to automatically create issues based on schedules and sensor data, like creating a git issue when tomorrow’s weather means we should check this afternoon that nothing gets left out in the rain.
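For anyone wanting to replicate this: creating the issue itself is just one call to Gitea’s REST API (POST /api/v1/repos/{owner}/{repo}/issues). Here’s a minimal Python sketch of the kind of helper Home Assistant could fire off via its shell_command integration; the repo name and environment variable names are hypothetical, adjust to your instance:

```python
#!/usr/bin/env python3
"""Sketch: open a Gitea issue from an automation.

The repo name and GITEA_* environment variables are hypothetical; Gitea's
issue endpoint is POST /api/v1/repos/{owner}/{repo}/issues.
"""
import os
import sys

import requests

GITEA_URL = os.environ["GITEA_URL"]        # e.g. https://git.example.home
GITEA_TOKEN = os.environ["GITEA_TOKEN"]    # API token with repo scope
REPO = os.environ.get("GITEA_REPO", "household/chores")  # hypothetical repo


def create_issue(title: str, body: str = "") -> int:
    """Open an issue and return its number."""
    resp = requests.post(
        f"{GITEA_URL}/api/v1/repos/{REPO}/issues",
        headers={"Authorization": f"token {GITEA_TOKEN}"},
        json={"title": title, "body": body},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["number"]


if __name__ == "__main__":
    # e.g. called from a Home Assistant shell_command with the title as an argument
    print(create_issue(sys.argv[1], sys.argv[2] if len(sys.argv) > 2 else ""))
```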
Matrix (also self-hosted) lets Gitea and Home Assistant bully us into remembering to do things we might have forgotten. (Send a second notification if the washer finished 15 minutes ago, but the dryer never started)
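Home Assistant has a Matrix notify integration that can send these directly, but if you’d rather script the Matrix side yourself, the nag message is a single room_send with the matrix-nio library. A rough sketch, with the homeserver, bot account, and room ID all made up:

```python
#!/usr/bin/env python3
"""Sketch: send a nag message to a Matrix room with matrix-nio.

Homeserver, bot user, and room ID are placeholders; Home Assistant's own
Matrix notify integration can do the same thing without a separate script.
"""
import asyncio
import os

from nio import AsyncClient

HOMESERVER = "https://matrix.example.home"   # hypothetical homeserver
USER = "@housebot:example.home"              # hypothetical bot account
ROOM_ID = "!abc123:example.home"             # hypothetical room


async def nag(message: str) -> None:
    client = AsyncClient(HOMESERVER, USER)
    await client.login(os.environ["MATRIX_PASSWORD"])  # bot account password
    await client.room_send(
        ROOM_ID,
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": message},
    )
    await client.close()


if __name__ == "__main__":
    asyncio.run(nag("Washer finished 15 minutes ago and the dryer never started."))
```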
It’s been fantastic being able to create git issues for honey-dos as well as having the automations that create issues for recurring tasks. “Hey, we need to take X to the vet for Y sometime next week.” “Oh yeah, can you go ahead and put in a ticket?” And vice versa.
I have looked at the ROI of getting more efficient kit and discovered that going for a low-idle-power-draw system like a NUC or thin client plus a disk enclosure has a payback period on the order of multiple years.
Based on that information, I’ve instead put that money towards lower hanging fruit in the form of upgrading older inefficient appliances and adding multi-zone temperature control for power savings.
The energy savings I’ve been able to make based on long-term energy-use data collected via Home Assistant have more than offset all of the electricity I’ve ever used to power the system itself.
~120W with an old server motherboard and 6 spinning drives (42TB of storage overall).
Currently running Nextcloud, Home Assistant, Gitea, Matrix, Jellyfin, Lemmy, Mastodon, Vaultwarden, and a bunch of other smaller stuff, alongside storing a few months’ worth of surveillance footage, so ~$12/month in power certainly ain’t a bad deal versus paying for hosted versions of even a fraction of those services.
Pretty darn well. I actually needed to do some maintenance on the server earlier today, so I just migrated all of the VMs over to my desktop, did the server maintenance, and then moved the VMs back over to the server, all while live and functioning. Running ping in the background, it looks like it missed a handful of pings as the switches figured their life out, and then everything was right back where it was; not even long enough for uptime-kuma to notice.
they need to be using shared storage for disks
You can perform a live migration without shared storage with libvirt
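For anyone curious, here’s a rough sketch of that with the libvirt Python bindings (domain and host names are placeholders). The key bit is the VIR_MIGRATE_NON_SHARED_DISK flag, which has libvirt stream the disk contents itself rather than requiring shared storage; it’s the same mechanism as virsh migrate --live --copy-storage-all:

```python
#!/usr/bin/env python3
"""Sketch: live-migrate a VM between two hosts without shared storage.

Host and domain names are placeholders. The destination needs a disk image
of the same size already in place for libvirt to copy into.
"""
import libvirt

DOMAIN = "homelab-vm"                     # hypothetical VM name
SRC_URI = "qemu:///system"                # run this on the source host
DST_URI = "qemu+ssh://desktop/system"     # hypothetical destination host

src = libvirt.open(SRC_URI)
dst = libvirt.open(DST_URI)

dom = src.lookupByName(DOMAIN)

# VIR_MIGRATE_LIVE keeps the guest running during the copy;
# VIR_MIGRATE_NON_SHARED_DISK streams the full disk contents to the
# destination, so no shared storage backend is required.
flags = libvirt.VIR_MIGRATE_LIVE | libvirt.VIR_MIGRATE_NON_SHARED_DISK
dom.migrate(dst, flags, None, None, 0)

src.close()
dst.close()
```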
Stop using a rolling release distro for something that you actually rely on day-to-day.
Yeah, rolling release on a server sounds horrifying. You couldn’t pay me enough to live that nightmare.
There’s a reason “enterprise” server distros exist. Install an LTS release once every 2, 4, or 5 years depending on taste, log in to update whenever you remember the machine is even running an OS, and just generally forget the machine exists for several years at a time.
OpenWRT, because it has a nice interface, runs on half a toaster, and I’ve yet to find something that I need it to do that it couldn’t do but OPNsense could.
I did try pfSense many years back and it just seemed overly complicated and generally flaky. I had trouble setting it up as a tinc VPN client despite that being a trivial task in OpenWRT, so I switched back.
Mastodon is a helluva lot easier to self-host than Lemmy, so if you got Lemmy running reliably then Mastodon would be a breeze.
My partner and I use a pinned issue as the grocery list on our git repo for managing our household, all running on top of a self-hosted Gitea instance.
Great for being able to create git issues for honey-dos as well as having automations for creating issues for recurring tasks.
“Hey, we need to take X to the vet for Y sometime next week.” “Oh yeah, can you go ahead and put in a ticket?” And vice versa.
SBCs like the RPi are kind of awkwardly in-between a microcontroller like an Arduino or ESP32 that you can actually trust with handling GPIO and data logging, and a real Linux system that can actually do meaningful computational work.
Pretty much the only tasks I’ve found them reliably appropriate for are running OctoPrint, really light computer vision for robotics, or hooking up an RTL-SDR to use as a police/ham-radio scanner. Outside of those, it’s so much easier to use either a cheaper and more reliable MCU or a much more powerful old laptop or desktop.
Not actually used it. I started off doing local backups, B2 was an add-on way later down the road.
I only do an automated `copy` to B2 from the local archive, no automated `sync`, which as far as I understand should be non-destructive with versioning enabled. If I need to prune, etc., I’ll manually run `sync` and then immediately run `restic check --read-data` from a fast VPS to verify the B2 copy afterwards.
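To make the copy-vs-sync distinction concrete, here’s a rough sketch of that manual prune-and-verify pass. The repo path, bucket, and rclone remote name are made up, and RESTIC_PASSWORD is assumed to already be set in the environment:

```python
#!/usr/bin/env python3
"""Sketch: manual sync-and-verify pass after the automated copies.

Paths, bucket, and the rclone remote name ("b2") are placeholders, and
RESTIC_PASSWORD is assumed to be set in the environment. `rclone sync`
mirrors the local repo exactly (deleting remote-only files), which is why
it's only ever run by hand, followed by a full read-back of the B2 copy.
"""
import subprocess

LOCAL_REPO = "/mnt/backup/restic-repo"      # hypothetical local restic repo
REMOTE_REPO = "b2:my-backups/restic-repo"   # hypothetical rclone remote + bucket

# Mirror the local repo to B2, including any deletions from pruning.
subprocess.run(["rclone", "sync", LOCAL_REPO, REMOTE_REPO], check=True)

# Then, ideally from a fast VPS, re-read every pack in the B2 copy via
# restic's rclone backend to make sure the remote repository is intact.
subprocess.run(
    ["restic", "-r", f"rclone:{REMOTE_REPO}", "check", "--read-data"],
    check=True,
)
```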
An external hard drive is a lot faster than my internet connection and helps fulfill 3-2-1 requirements.
I back up to an external drive and then rclone copy that up to Backblaze B2.
I use restic with a local external drive that is then synced to Backblaze B2 via rclone.
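If it helps anyone set up something similar, the whole flow is basically two commands. A minimal sketch with made-up source paths, repo path, and remote name, assuming RESTIC_PASSWORD is already in the environment:

```python
#!/usr/bin/env python3
"""Sketch: nightly backup to a local external drive, then copy up to B2.

Source paths, repo path, and the rclone remote are placeholders, and
RESTIC_PASSWORD is assumed to be set in the environment (e.g. by systemd).
"""
import subprocess

SOURCES = ["/home", "/etc"]                 # hypothetical paths to back up
LOCAL_REPO = "/mnt/backup/restic-repo"      # restic repo on the external drive
REMOTE_REPO = "b2:my-backups/restic-repo"   # hypothetical rclone remote + bucket

# Snapshot to the local repo first: fast, and covers the "2" in 3-2-1.
subprocess.run(["restic", "-r", LOCAL_REPO, "backup", *SOURCES], check=True)

# Then copy (not sync) the repo to B2, so nothing is ever deleted remotely.
subprocess.run(["rclone", "copy", LOCAL_REPO, REMOTE_REPO], check=True)
```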
I also really liked Google Keep. Carnet was at one point a decent drop-in replacement on Android+Nextcloud, but it got progressively bitrotted over time and now I just use Nextcloud Notes until I find something better.
Yep, currently using restic with Backblaze B2 via rclone
Washing machine is a threshold sensor in Home Assistant on the power draw entity of a Sonoff S31 smart outlet flashed w/ ESPHome.
Dryer is another threshold sensor on a current clamp connected to an ESP32 running ESPHome.