I’m a retired Unix admin; it was my job from the early '90s until the mid '10s, and I’ve kept somewhat current since by running various machines at home. So far I’ve managed to avoid using Docker at home even though I have a decent understanding of how it works - after I stopped being a sysadmin I still worked for a technology company and did plenty of “interesting” reading and training.
It seems that more and more stuff that I want to run at home is being delivered as Docker-first and I have to really go out of my way to find a non-Docker install.
I’m thinking it’s no longer a fad and I should invest some time getting comfortable with it?
Because it seems overkill for a home server. Up until recently all I ran was Samba and a torrent daemon. Why would I install another layer of overhead to manage two applications on one server?
Because the overhead is practically nil, barring the extra disk space. Maybe it’s not worth using it for Samba and Transmission alone. But throw OpenVPN for Transmission into the mix and things get a lot more complicated: Samba has to keep serving the LAN while Transmission has to stop whenever OpenVPN stops. Grab the haugene/transmission-openvpn image instead and the problem is solved by writing one 20-line docker-compose.yml and doing
docker-compose up -d
version: '3.3'
services:
  transmission-openvpn:
    cap_add:
      - NET_ADMIN
    volumes:
      - '/your/storage/path/:/data'
      - '/your/config/path/:/config'
    environment:
      - OPENVPN_PROVIDER=PIA
      - OPENVPN_CONFIG=france
      - OPENVPN_USERNAME=user
      - OPENVPN_PASSWORD=pass
      - LOCAL_NETWORK=192.168.0.0/16
    logging:
      driver: json-file
      options:
        max-size: 10m
    ports:
      - '9091:9091'
    restart: on-failure
    image: haugene/transmission-openvpn
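If you want to see the coupling in action, a couple of quick sanity checks from the directory holding the compose file (nothing here beyond stock docker-compose subcommands; as I understand the image, Transmission only starts once the tunnel is up, and the container exits if OpenVPN dies, at which point the on-failure restart policy brings the whole thing back):

# Confirm the container is up and the 9091 port mapping took
docker-compose ps

# Tail the combined OpenVPN/Transmission logs to watch the tunnel come up
docker-compose logs -f transmission-openvpn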
A benefit of Docker that helps even with a single-service deployment is the packaging side. It lets you run near-arbitrary service versions on top of your host OS: stale, stable, bleeding edge, or anything in between.
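For instance (a sketch, not gospel: linuxserver/transmission is a real image, but the pinned tag below is an assumption, so check which tags it actually publishes):

# Host keeps its distro packages; each container picks its own version.
docker run -d --name transmission-pinned -p 9091:9091 linuxserver/transmission:4.0.5
docker run -d --name transmission-edge -p 9092:9091 linuxserver/transmission:latest

Two versions of the same daemon, side by side, and neither one cares what your host OS ships.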