So, I really need to start backing up my files, because I have a lot of stuff on my computer right now that I cannot bear to lose. I have no problem picking up an extra hard drive (either external or internal) and installing it, but I want good, reliable software that will back up my files automatically, with versioning that offers some kind of protection against ransomware. I'm not overly worried about ransomware in the sense that I'm a shitty target for people who actually want money, but on the off chance I'm afflicted with this particular scourge, I want to know that I can recover most of my less recent work. It's not clear to me how any file backup system can actually offer anti-ransomware protection, because any physical media is going to be susceptible to being encrypted, versioning or not.
So do I need to have a manual backup schedule with an air gap? What are my options here? While it is my ambition to be the kind of person who can write scripts for their computer and have daemons scurrying around doing my bidding, I am not one of those people. This does need to be some minimum level of easy.
I just have a bog-standard external USB drive and do a semi-regular drag-and-drop of the data that needs backing up.
There are some fancy new ones you can set up as a NAS if you like, which would help with automation.
ETA: I'm a sysadmin for my day job, but when all I need is a simple copy of folders X, Y, and Z on whatever schedule, plugging in the drive and dragging-and-dropping is a lot easier than fiddling around with backup programs and all the mess of setting up something automated. The KISS rule is a good one to follow most of the time.
On top of whatever system you get working, I'd suggest another layer: plug the external drive into a timer that's usually off. If you're crafty, set it to back up upon mounting (see the sketch below). If not, at least the drive will probably be offline if ransom/crypto/under-ware hits.
PS watch out for crypto underwear, it’ll obfuscate your bits!
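If you want the back-up-on-mount trick and happen to have a Linux box in the mix, here's a minimal sketch using a systemd unit; it assumes the drive mounts at /mnt/backup and that run-backup.sh is a script you supply (both names are hypothetical):

```
# /etc/systemd/system/backup-on-mount.service -- hedged sketch, Linux only.
# Assumes the drive's mount unit is mnt-backup.mount (i.e. it mounts at
# /mnt/backup) and that /usr/local/bin/run-backup.sh is your own script.
[Unit]
Description=Run backup whenever the external drive is mounted
Requires=mnt-backup.mount
After=mnt-backup.mount

[Service]
Type=oneshot
ExecStart=/usr/local/bin/run-backup.sh

[Install]
WantedBy=mnt-backup.mount
```

Enable it once with systemctl enable backup-on-mount.service and it should fire each time the mount appears.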
You don’t need an air gap as long as your backup system keeps versioned copies of your files, so that you can restore files from a specific date in the past. Dropbox, for example, offers this option only on paid accounts, I think; so the free version is not useful for ransomware scenarios. Most of these “cloud drive”/“cloud backup” solutions have similar policies; but if you are willing to pay, they are probably the easiest option.
Second-easiest option is to have a local NAS or USB drive where you copy data periodically. Depending on when the ransomware hits you, these might also be ineffective (it doesn’t matter if you have previous versions of files, if the ransomware can get to them…).
If you are a Mac user, Time Capsule will do versioning automatically and should (in theory) be pretty safe, although there are rumours of cryptolocker-style software trying to break into TC as well (unsuccessfully, for now, reportedly).
I basically use both “cloud drives” and a local Time Capsule (well, a Linux box pretending to be one, but it’s the same thing), plus periodic pushes to Amazon Glacier.
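For the Glacier leg, a hedged sketch using the AWS CLI (it assumes the CLI is already configured, and both the bucket and archive names here are made up):

```
# Push a monthly archive straight into the Glacier storage class.
aws s3 cp monthly-archive.tar.gz s3://my-backups/ --storage-class GLACIER
```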
The best backup systems are the ones you don’t need to pay attention to.
Duplicati, which is modeled on Linux's duplicity (an rsync-based backup tool), works remarkably well and is pretty idiot-friendly. I keep an SD card in my laptop and use Duplicati to periodically back up critical directories to that card, but it can also back up to various cloud services and to a NAS (see the sketch below).
Since I’m risk-averse, I also back up some directories elsewhere using other services.
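If you'd rather skip the GUI, the underlying duplicity idea looks roughly like this (a hedged sketch: the paths are made up, and duplicity encrypts with GPG by default, so it will ask for a passphrase):

```
# Back up a directory to the SD card; repeat runs are incremental versions.
duplicity ~/Documents file:///media/sdcard/backups/documents

# Restore the files as they stood three days ago, into a fresh directory.
duplicity -t 3D file:///media/sdcard/backups/documents ~/restored-docs
```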
What do you think of Duplicati for backing up gmail and Google Drive accounts, especially if there's a lot (is 15GB a lot?) to copy and store the first time it runs?
Duplicati is pretty fast, and 15GB is not very much. (The SD card in my laptop is 32GB, and you can get a 2TB external drive for around $80 nowadays.) I don't think Duplicati can back up straight from the cloud, but if you're talking about downloading files from Google to your drive and then backing them up somewhere, I suppose it would be as fast as anything.
This is the part that worries me. When I moved last, figuring out how to export copies of gmail took a lonnnng time. And I hope I never have to try and figure out how to read them because it’s like a long ribbon of what I imagine to be free-floating database content. I’m not a tech professional, but my hope was that, once supplied with the appropriate account credentials, Duplicati could burrow into the gmail and Google Drive account and, with a little prompting, grab copies of all of it.
It’s good to hear 15GB isn’t much for copying. It feels like a lot of content when I think about sorting through it manually. Of course, that’s without the magic of digital processing and memory so . . . some good news.
I came here to recommend rsync with a front end, but @d_r beat me to it, so I'll second the recommendation. I'm kind of silly, so I tend to do these things the hard way (no front end).
I’ll also paraphrase the 3-2-1 rule of backups …
Keep 3 copies of your data, on 2 different kinds of storage, with at least 1 of them off-site; then in a disaster you'll still have at least one copy. Since a regular trip to swap media in a safety deposit box is a hassle most of us would mostly skip, an internet backup service (one where you encrypt the data before you send it up) really is a necessity.
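One hedged way to do the "encrypt before you send it up" part with nothing but tar and gpg (the filenames here are placeholders):

```
# Bundle the folder, then symmetrically encrypt it; gpg prompts for a
# passphrase, and only backup.tar.gz.gpg ever leaves your machine.
tar czf backup.tar.gz ~/Documents
gpg --symmetric --cipher-algo AES256 backup.tar.gz
```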
No, I'm afraid it's just a file/directory backup system. I don't use gmail, but I assume there's lots of software that can download your email to a local repository for safekeeping; Duplicati could then back it up from there. (Quick search) maybe Gmvault?
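If Gmvault pans out, the basic usage is supposed to be a one-liner (a hedged sketch; the address and database path are placeholders):

```
# Sync a Gmail account into a local database a backup tool can then grab.
gmvault sync -d ~/backups/gmvault-db yourname@gmail.com
```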
I'll add my support for a utility that uses rsync as a backend. It's solid and recovers well when a file transfer is interrupted. I'm in the script-it-yourself-and-tunnel-through-ssh-to-a-NAS school, so I can't really help with a front-end recommendation.
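For the curious, the script-it-yourself version is mostly one rsync line (a hedged sketch; the user, host, and paths are all hypothetical):

```
# Incrementally push a folder to a NAS over ssh. Add --delete if you want
# the far side to be an exact mirror, but deletions will then propagate.
rsync -avz -e ssh ~/Documents/ backupuser@nas.local:/volume1/backups/documents/
```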
Here's what I do, FWIW. I'm on a MacBook Pro with a 1TB hard drive (750GB of which I care about; the other ~250GB is my Windows gaming/cross-browser-testing partition).
First line of defence is a 3TB external drive that Time Machine backs up to. The extra size ensures I can keep tonnes of deleted files and old versions of stuff. If an image gets corrupted, for example, I can likely pull a good version off of Time Machine.
Second line of defence is Backblaze. Keep an eye on BoingBoing; their StackSocial thingy sometimes has deals on a year of Backblaze, and I used one of those promotions to set up my mum. Backblaze is great, since you can do individual file restores from the web interface or phone app.
Third line of defence is a full bootable clone of the system, done weekly. For that, I have two bare hard drives I connect with a USB-to-SATA interface and keep in shock cases. One drive gets the clone one week, the other the next. That's your ransomware protection right there: these things usually sit on your machine for less than a week before they do something ugly.
On a Mac, it's pretty simple to encrypt your backups: FileVault covers the startup disk, and Time Machine has its own "encrypt backups" option for the external drive. I'm not sure what the Windows setup is.
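If you like poking at Time Machine from the terminal, tmutil exposes the basics (a hedged sketch; man tmutil has the full story):

```
# Kick off a Time Machine backup right now instead of waiting for the hour.
tmutil startbackup

# List the dated snapshots available to restore from.
tmutil listbackups
```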
Thanks to everyone for making some pretty good suggestions! I like the idea of an air gap, but I might make it weekly or biweekly. My uni gives me great access to Box; I just can't rely on it after I leave. My big hesitation with cloud backup is perhaps silly, but I worry about service interruption because of a billing error.
But it’s clear that everyone has a different approach and what I’m hearing loud and clear is “one backup isn’t really a backup.”
I have an MBP and three off-site users (Mom, Dad, little bro) that are backed up to my local file server via CrashPlan (I use the free personal service, as my local server has enough space for the backups and then some). The app does allow for encrypting the backup, and the clients can be managed from the CrashPlan website or the app itself. Other than that, I've got an external drive that I plug in every now and again for use with Time Machine. The NAS and the external drive are both on the property, so I don't really have an off-site backup, although I've been toying with the idea of finally paying for CrashPlan as a full user.
Do you have a bash script that connects to Google Drive and Gmail accounts and copies all the files back to a directory on a local drive?
I can map out conceptually what operations need to occur. That's a long way from knowing the syntax and the diplomacy needed to navigate all the way up and back through all the "layers" and return with some files.
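A hedged sketch of what such a script might look like, leaning on rclone for the Drive side and the Gmvault tool mentioned above for mail. It assumes you've installed both and run rclone config once to authorize a remote named gdrive; every path and address below is a placeholder:

```
#!/usr/bin/env bash
# Hedged sketch: pull Google Drive and Gmail down into local folders that
# an ordinary backup tool can then sweep up. Nothing here is turnkey.
set -euo pipefail

# Copy everything from the Drive remote into a local directory.
rclone copy gdrive: "$HOME/backups/google-drive"

# Sync Gmail into a local database (Gmvault walks you through OAuth once).
gmvault sync -d "$HOME/backups/gmvault-db" yourname@gmail.com
```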
I have a pair of external drives, and once a day (typically) a piece of freeware mirrors one to the other. In addition to that, there's a cloud backup that updates continuously.
So it's not instant, but a good chunk of my images is mirrored both at home and far, far away in case of any local disaster. The nice thing is that I can flip a switch so the second drive won't update, and I can dismount it anytime I want. I can also pull from the cloud service anytime, anywhere, if I have some idea of what I'm looking for.
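If the freeware ever lets you down, the mirror step itself is basically one rsync line (a hedged sketch; the mount points are made up, and the --dry-run pass previews the changes before anything happens):

```
# Preview the mirror, then run it for real; --delete keeps the copy exact.
rsync -av --delete --dry-run /Volumes/photos-primary/ /Volumes/photos-mirror/
rsync -av --delete /Volumes/photos-primary/ /Volumes/photos-mirror/
```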