Hi,
since we're getting fast internet here in the next few weeks, an off-site backup is finally becoming realistic.
My parents-in-law are also getting fast internet, so I asked them whether I could host some kind of off-site backup at their house.
Obviously, it should consume as little power as possible and should be as easy as possible to install.
My current plan is to get an old Dell OptiPlex SFF and put an 18TB HDD into it. Then I just have to connect it to LAN and power at their place.
The machine will always be on and connected via WireGuard to my homeserver.
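The WireGuard side on the remote box should be simple; roughly something like this (addresses, keys and the endpoint are just placeholders, not my actual setup):

```
# Sketch only: tunnel config on the off-site machine, dialing home.
cat > /etc/wireguard/wg0.conf <<'EOF'
[Interface]
Address = 10.8.0.2/24
PrivateKey = <offsite-private-key>

[Peer]
PublicKey = <homeserver-public-key>
Endpoint = home.example.org:51820
AllowedIPs = 10.8.0.0/24
PersistentKeepalive = 25
EOF

systemctl enable --now wg-quick@wg0
```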
Any better ideas? Anything I missed?
Literally less than 24h ago: https://www.reddit.com/r/DataHoarder/comments/17y8omd/small_offsite_backup/
Wow, thanks. I wonder how I missed that.
I do not see any issues with your configuration.
You can really use anything there: rclone with any cloud provider, Backblaze Personal, etc. You can also encrypt the data before uploading it to the cloud, so it won't be readable by the provider; as an alternative, StarWind VTL can be used to upload the data in parts.
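For example, with a crypt remote (say it's called "secret", layered over a B2 bucket or any other backend) set up via rclone config, the upload is just something like this (names and paths made up):

```
# Everything pushed through the "secret" crypt remote is encrypted client-side,
# so the cloud provider only ever stores ciphertext.
rclone sync /srv/data secret:homeserver --progress --transfers 4

# Optional: verify the encrypted copy against the local files.
rclone cryptcheck /srv/data secret:homeserver
```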
I have a very similar setup: an old PC with lots of big disk drives at my mom’s house.
However, instead of leaving it powered on 24/7, I have it configured to power itself on at a particular day and time every week. My backup script connects to it, mounts the disks, copies the data, and then sends a shutdown command.
This way, the computer isn't running 24/7 (and open to hacking attempts), and the disks are only spinning when they're in use. It also saves my mom some money on her power bill :)
Just to be safe, I still have the PC plugged into a UPS.
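(If the BIOS only offers a daily alarm, the same weekly wake-up can also be armed from Linux by programming the RTC right before shutting down, roughly like this:)

```
# Program the hardware clock to wake the machine again in 7 days;
# -m no only sets the RTC alarm, it doesn't suspend anything itself.
sudo rtcwake -m no -s $((7*24*60*60))
sudo systemctl poweroff
```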
Really interested in hearing how you would go about pulling this off. It sounds exactly like what I would want to do with certain shares on my Unraid server. Is it literally a script describing what folders to download through wireguard/tailscale/etc at a given moment? Or do you use something like syncthing, with the added instructions to shut the pc down when done?
More like the latter, except I use rsync (running over SSH) so as to minimize the amount of traffic. I understand syncthing works in a similar manner, but I haven’t tried it out (I’ve heard good things about it though).
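The copy step itself is basically just this (host and paths made up, and I've left the wake-up part out):

```
# Make sure the backup disks are mounted on the remote box first.
ssh backup@remote-box 'sudo mount /mnt/backup'

# Copy only what changed since last time; -a keeps permissions and times,
# -H preserves hard links, --delete mirrors deletions.
rsync -aH --delete --partial -e ssh /srv/shares/ backup@remote-box:/mnt/backup/shares/

# Done: power the machine back down until next week.
ssh backup@remote-box 'sudo systemctl poweroff'
```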
I had actually thought about doing this. A Dell OptiPlex has this feature in the BIOS, as far as I know.
So, it should be really simple from this point (rough sketch below the list):
- Run a systemd timer on the home server at the time of night when the remote machine is set to power on
- Ping the machine on its WireGuard interface until it answers
- SSH into the machine and do some basic checks
- Run the backup script to back up via borg to the remote repository
- Shut down the remote machine
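Something along these lines, as a rough sketch only; the WireGuard address, user, paths and repo location are placeholders, not a finished script:

```
#!/usr/bin/env bash
# Rough sketch of the nightly job on the home server, triggered by a systemd
# timer (or cron). Hostname/IP, user, paths and repo are placeholders.
set -euo pipefail

REMOTE=10.8.0.2                                      # WireGuard address of the off-site box
REPO="ssh://backup@${REMOTE}/srv/borg/homeserver"    # borg repository on the remote disk
export BORG_PASSCOMMAND='cat /etc/borg/passphrase'   # however the repo passphrase is stored

# 1. Wait for the machine to come up (BIOS auto power-on)
up=0
for _ in $(seq 1 60); do
    if ping -c1 -W2 "$REMOTE" >/dev/null 2>&1; then up=1; break; fi
    sleep 10
done
[ "$up" -eq 1 ] || { echo "remote machine never came up"; exit 1; }

# 2. Basic checks over SSH: is the backup disk mounted, how full is it?
ssh backup@"$REMOTE" 'findmnt /srv/borg >/dev/null && df -h /srv/borg'

# 3. Push the backup via borg to the remote repository
borg create --stats --compression zstd \
    "${REPO}::homeserver-{now:%Y-%m-%d}" /srv/data /etc

# 4. Shut down the remote machine again
ssh backup@"$REMOTE" 'sudo systemctl poweroff'
```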
Optionally, keep a config file for the script on the local machine. For example, once in a while I don't want to shut the machine down after the backup, so that I can do some updates on the remote machine the next morning and shut it down by hand afterwards.
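A flag file would do for that; something along these lines in the backup script (path and names are just an example):

```
# Hypothetical "keep running" flag: if it exists, skip the final shutdown
# so the box stays up for manual maintenance the next morning.
if [ -e /etc/offsite-backup/keep-running ]; then
    echo "keep-running flag set, leaving the remote machine powered on"
else
    ssh backup@10.8.0.2 'sudo systemctl poweroff'
fi
```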