
I purchased an external 4TB SSD and I make a point of backing my stuff up offline (downloading via Takeout) every few weeks just in case some catastrophic accident happens on Google's end that results in my data being permanently deleted.

I store the local drive in a "fireproof" safe (note: fireproof is a bit of a misnomer, since stuff can still burn in a fireproof safe, though it's a better bet than not having one if your home/apartment catches fire).

I also back up to Amazon Glacier as a third offsite backup. One could also use S3 for this, though it might be a bit more expensive than Glacier.
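For reference, a minimal sketch of what the Glacier upload can look like with boto3, assuming AWS credentials are already configured; the bucket and file names are hypothetical. (Glacier is exposed through S3 storage classes these days, so this goes through the S3 API.)

    import boto3

    s3 = boto3.client("s3")

    # Upload the archive straight into a Glacier-class storage tier.
    # Filename, bucket, and key here are hypothetical.
    s3.upload_file(
        "takeout-2024-06.tar.gz",
        "my-offsite-backups",
        "takeout/takeout-2024-06.tar.gz",
        ExtraArgs={"StorageClass": "DEEP_ARCHIVE"},
    )

DEEP_ARCHIVE is the cheapest class; swap in "GLACIER" (now branded Flexible Retrieval) if you want faster restores.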



Wow, we've come full circle if we're backing up our cloud data to a local SSD :-)


IMHO, that has always been a good idea.

It’s basically the good old 3-2-1 backup rule: Three copies of your data (your production data and two backups) on two different media, with one copy off-site.


I’d assume having your data in two cloud services is already good enough, provided they are different companies in different locations.


As an archivist, I'd say 3-2-2-1 is better. The second 2 indicates that the two backups should be on two different brands of media, in case of a catastrophic design flaw.


The 3-2-1 strategy is also a good idea. The SSD in a safe place is great, but in case of a deep catastrophe I prefer some sort of easy-access cloud solution, preferably one that's zero-knowledge, either because it has that baked in or because you use something like Cryptomator.

For cloud backup of anything, I've never trusted Google for shit, even aside from its grotesquely parasitic "privacy" policies (a complete joke of a word for anything to do with that company).

Instead, I've tried several so far and, maybe unexpectedly, the easiest to work with and simply functional across multiple machines and several external drives has been SpiderOak. Its desktop interface is shit, but the overall service is easy to use and has never failed me once so far, and it continuously performs incremental, deduplicated backups of even minor file/folder changes as long as it's running.

As a second option, Arq has been nice too, though its UI is also rather clumsy.


I keep running out of space because of the photos that my phone keeps automatically uploading to Google Photos. I've done a couple of downloads, but now it's all just sitting on my hard drive. I clearly need a better place to store my photos, preferably somewhere we can look at them, enjoy them and organise them.

I've been thinking about getting a NAS, but I feel like I probably need something more than that. It's all pretty new ground to me, though. I think I want to run my own "cloud" from my home that my phone can sync with, helping me organise this stuff and show it to people I want to share it with. But controlled by me and not by Google or some other cloud service.


Synology has a good suite of products here that run on their NAS devices. Moments is a Google Photos clone - not quite as good, but definitely good enough.


Glacier should be sufficient for catastrophic accidents. It's cheap to store ($1/TB/month); the traffic costs are damn expensive ($90/TB), but you won't need to pay for it except in catastrophic cases. However, you will need to somehow make sure you'll be able to access the data when you do need it.
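Part of that is remembering that Glacier-class objects have to be "thawed" before you can download them. A minimal sketch with boto3, using hypothetical bucket/key names:

    import boto3

    s3 = boto3.client("s3")

    # Ask S3 to thaw a Glacier-class object. The restore runs
    # asynchronously; once done, the object stays downloadable for `Days`.
    s3.restore_object(
        Bucket="my-offsite-backups",
        Key="takeout/takeout-2024-06.tar.gz",
        RestoreRequest={
            "Days": 7,
            "GlacierJobParameters": {"Tier": "Bulk"},  # slowest, cheapest tier
        },
    )

    # Poll until the Restore header reports ongoing-request="false".
    head = s3.head_object(Bucket="my-offsite-backups",
                          Key="takeout/takeout-2024-06.tar.gz")
    print(head.get("Restore"))

Bulk restores can take anywhere from hours to a couple of days depending on the storage class, which is fine for the catastrophic-recovery case but worth knowing before you actually need it.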

For online storage I've found S3-style cloud storage to be substantially better than consumer-facing services like Google Drive, OneDrive or Dropbox. Amazon S3 may be expensive, but you can use Backblaze B2 for $6/TB/month, which is just as good and just as fast. Then use rclone as a file browser. You can share single files by generating a download link, so others can download at 1 Gbps with no hidden limits.
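As a sketch of the link-sharing part: B2 speaks the S3 protocol, so boto3 can mint time-limited download URLs against it. The endpoint below is one example region; the credentials, bucket, and key are hypothetical placeholders.

    import boto3

    # Backblaze B2 exposes an S3-compatible API.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.us-west-004.backblazeb2.com",
        aws_access_key_id="<keyID>",
        aws_secret_access_key="<applicationKey>",
    )

    # Generate a time-limited download link to hand to someone else.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-backups", "Key": "photos/2023.tar.gz"},
        ExpiresIn=3600,  # valid for one hour
    )
    print(url)

(rclone's `link` subcommand can produce similar share links for supported remotes.)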


> but you won't need to pay for it except in catastrophic cases.

Is an untested backup really a backup?


Test the restore before uploading it, then. AWS is trustworthy at accepting and keeping your data.
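In practice that can be as simple as checksumming the archive before upload and comparing against a test restore later. A minimal sketch with hypothetical file paths:

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        # Stream the file so large archives don't need to fit in RAM.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    original = Path("takeout-2024-06.tar.gz")           # what you uploaded
    restored = Path("restored/takeout-2024-06.tar.gz")  # what came back

    assert sha256_of(original) == sha256_of(restored), "restore mismatch!"
    print("backup verified")

Keeping the original's hash somewhere separate means you can still verify a restore even after the local copy is gone.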


I've looked into Glacier as a clone of my NAS before, but found it unjustifiably expensive. Do you mind sharing how much you pay?


Look at Wasabi: S3 interface, but cheaper.


The real question is whether it’s better off in the fireproof safe that you most likely will not have time to access in an actual emergency, or directly in your emergency “go-bag”.


If your failure mode is "Google loses my data", putting it in the secure safe is best. It's not likely Google will have data loss on the same day your house burns down.

What emergency could affect both you and Google, but not lead to the kind of apocalyptic situation where you'd prefer an extra first aid kit, food or ammunition to a hard drive in your go bag? Perhaps a huge earthquake is a legitimate worry if you're on the US West Coast.


It could, though: a couple of months after the house burns down, you're kinda busy during those two months, and making a new onsite copy is not the main priority.


I imagine Google does cross-region replication.


Secondary question: which of those is more likely to survive a fire while you're at work or at the grocery store?


Suggestion: with these topics, always start by working on your threat model and risk matrix, and let everything else come out of that. This helps focus the effort and the mitigations on what is relevant for you and what you're protecting.


Do you use Takeout, or what method do you use every few weeks to grab all your data from cloud services, including Google?


You can use rclone (rsync itself can't talk to Google Drive) to copy everything from Google Drive as a cron job. If you want copies of your Google Docs and Sheets, you'll need something like SyncDocs to download and convert them to MS Office format.
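A minimal sketch of such a job, assuming an rclone remote named "gdrive" has already been set up with `rclone config`; the local path is hypothetical. Note that rclone's drive backend can itself export Docs/Sheets to Office formats, which covers part of what SyncDocs does:

    #!/usr/bin/env python3
    # Nightly Google Drive pull, intended to be run from cron.
    import subprocess
    import sys

    result = subprocess.run(
        [
            "rclone", "sync",
            "gdrive:",                   # everything under the Drive root
            "/mnt/backup/google-drive",  # local mirror (hypothetical path)
            # Export Google Docs/Sheets/Slides as Office files:
            "--drive-export-formats", "docx,xlsx,pptx",
        ],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        sys.exit("rclone sync failed: " + result.stderr)

Pointed at by a crontab entry, this keeps a fresh offline copy without any manual Takeout runs.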



