We’ve been running into reliability issues with our main web hosting provider lately. They seem to have square thumbs, and they’ve had major data centre power issues over the last couple of months. The last time they touched our server it was to put in an extra backup drive, and they managed to knock us offline for hours, despite our paying an extra $100/month for offsite storage. Inexplicably, that offsite storage has the same upload speed limit as we get from our own offices. You would think a dedicated server host would have the foresight to keep a fat pipe open to their offsite backup so they could put clients back online faster.
I was happy that we’d already put in an automated backup routine to our own office. We have the bandwidth available for dailies and use it.
Unlike our hosting provider, who is attempting to wriggle out of their SLA obligations, we gave 100% refunds for hosting in January: when Foliovision promises reliable service, we provide it.
Imagine my shock when, exploring further backup options for servers, I learned that Jeff Atwood, author of Coding Horror and co-founder of Stack Overflow, lost the entire Coding Horror archives one year ago:
looks like 100% data loss. thanks.. crystaltech.. :(
you sort of assume your hosting company is competent. That’s not a safe assumption in my experience
looks like it’s 100% internet search caches for recovery. Any tips on recovering images, which typically aren’t cached?
What happened is that Atwood was hosting with a relatively reliable host who failed to do backups. So did he. Here are the tips and tricks of data safety:
- Start with a reliable host.
- Use their backup system.
- Test their backup system.
- Back up to local.
- Test your backups.
That means unpacking them and putting them back on the server. You have to be able to do this like a fire drill, because when you actually need to do it, it won’t be a drill, it will be a fire.
In this case, learning under pressure is not what you want to be doing.
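As a rough illustration, here’s the kind of sanity check a fire drill can automate. This is a minimal sketch assuming the backup is a tar.gz archive; the paths and the list of must-have files are made-up examples, not our production values.

```php
<?php
// Fire-drill sketch: unpack a backup and confirm the critical pieces
// actually came back out. All paths here are hypothetical examples.
$archive    = '/backups/daily/account-backup.tar.gz';
$restoreDir = '/tmp/restore-test';

$phar = new PharData($archive);
$phar->extractTo($restoreDir, null, true); // overwrite any previous drill

// Check for the files you could not live without.
$mustExist = array('homedir/public_html/index.php', 'mysql/database.sql');
foreach ($mustExist as $file) {
    if (!file_exists("$restoreDir/$file")) {
        echo "MISSING: $file -- this backup would not save you\n";
    }
}
```

A script like this only proves the archive unpacks cleanly; the full drill still means putting the files and the database back on a real server.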
How many iterations of your local backups you keep is up to you. Every once in a while a copy should go into offsite storage as well as staying local.
The internet is ephemeral. Your clients’ data had better not be.
cPanel has backups built-in but only to a single destination. If you want to back up to multiple destinations you’ll need to run a cron job outside of cPanel.
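Here’s a minimal sketch of what such a cron job might look like, written in PHP since that’s what we use elsewhere. The cPanel backup path, the destination hosts and the rsync options are all assumptions to adapt to your own setup.

```php
<?php
// Hypothetical multi-destination backup push, run from cron, e.g.:
//   30 4 * * * /usr/bin/php /root/multi-backup.php
$source = '/backup/cpbackup/daily/';  // assumed cPanel backup directory
$destinations = array(
    'backup1.example.com:/srv/backups/',  // placeholder hosts
    'backup2.example.com:/srv/backups/',
);

foreach ($destinations as $dest) {
    // rsync over SSH; -a preserves ownership, permissions and timestamps
    $cmd = sprintf('rsync -a %s %s 2>&1',
                   escapeshellarg($source), escapeshellarg($dest));
    $output = array();
    exec($cmd, $output, $status);
    if ($status !== 0) {
        // A backup that fails silently is worse than no backup at all.
        error_log("Backup to $dest failed:\n" . implode("\n", $output));
    }
}
```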
We’re now looking into additional backup systems like rdiff-backup, as well as DNS failover.
Now if we could only find a brilliant, Canadian, nginx-literate host in the meantime.
How we download our cPanel backups on Mac
We didn’t like any of the existing automated FTP tools for this, so we programmed our own PHP scripts, running under a MAMP Apache server on our office computers, to get the backups safely into our office. Since the PHP runs on our own machines, the PHP memory limit and execution time limit are not an issue.
To schedule these backup routines weekly, we use Lingon. The PHP script also sends us an email if any of the backups fails; in that case we just download the backup files by hand with ForkLift.
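For the curious, here is a simplified sketch of what such a download script looks like. The hostname, credentials and paths are placeholders, not our real configuration.

```php
<?php
// Pull a cPanel backup over FTP and email us if anything goes wrong.
// All connection details below are hypothetical placeholders.
$host       = 'server.example.com';
$remoteFile = '/backup/weekly/account-backup.tar.gz';
$localFile  = '/Users/office/Backups/account-backup.tar.gz';
$alertTo    = 'admin@example.com';

$conn = ftp_connect($host);
if (!$conn) {
    mail($alertTo, 'Backup download failed', "Could not reach $host");
    exit(1);
}
if (!ftp_login($conn, 'backupuser', 'secret')) {
    mail($alertTo, 'Backup download failed', "Login refused on $host");
    exit(1);
}
ftp_pasv($conn, true); // passive mode plays nicer with office NAT

if (!ftp_get($conn, $localFile, $remoteFile, FTP_BINARY)) {
    // This email is the cue to go fetch the file by hand with ForkLift.
    mail($alertTo, 'Backup download failed', "ftp_get failed for $remoteFile");
}
ftp_close($conn);
```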
Alec Kinnear
Alec has been helping businesses succeed online since 2000. Alec is an SEM expert with a background in advertising, as the former Head of Television for Grey Moscow and Senior Television Producer for Bates Saatchi & Saatchi Russia.