This is the first in a series of articles about a new backup process I have implemented for my home network. In this first article I'll cover background information and why I chose a non-traditional backup process. In future articles, I'll cover my implementation of this system on Windows.

Note: The custom backup system described here is not currently publicly available. Releasing it is something I am looking at doing in the future, but as it exists now it is not in a state that would be usable by the general public.

The nature of traditional backup processes

I’ve been thinking off and on for quite some time about setting up a new backup system for my home network. I started down this path for a better backup solution because I was not really happy with my existing NT Backup scripts. At first I thought I would continue to leverage NT Backup with a better set of scripts to handle backup rotations, but the more I thought about it the more I became convinced that this was not the way to build a backup system for the future.

Traditional backup programs are still largely built on concepts from the days when everyone backed up their data to cheap backup media (i.e. discs or tape). They are typically designed to back up all information to a set of backup media. Even though most backup programs now allow you to back up data to external hard disks or network locations, most still create monolithic backup files. Getting to the data once it's on the backup media usually involves a restore process that moves the individual file data back to a hard disk.

Nowadays, however, hard disks are cheap and online storage is getting cheaper by the minute. Backing up with a process that creates large, monolithic backup files just doesn't make as much sense anymore. Hard drives are very good at storing individual files, and online backup services are most efficient when they can upload small, incremental file changes rather than large monolithic backup files.

After this realization I started to look for possible alternatives to the monolithic backup process. Most of what I found offered little more than what NT Backup and some clever scripts could provide. There were some notable exceptions, and the solution I ultimately settled on is largely inspired by one of these exceptions: a tool named rsync. However, once I made the decision to move away from NT Backup and a traditional, monolithic backup system, things got a lot more complicated.

Along the way I've learned new technologies such as Microsoft's new PowerShell scripting language, how to work with the Windows Volume Shadow Copy Service (VSS), how to dig into Windows' GLOBALROOT device namespace, and the pitfalls of .NET/CLR and COM interop. I even dived pretty deep into cygwin development at one point along the way.

A more complicated setup requires a more complicated backup plan

I don’t have what one would call a typical home network. On my home network I have several laptops, workstations, a media computer, and a server. My server runs Windows Server 2003, acts as a domain controller, and I use domain accounts for all computer logins. I also use several advanced features only available to computers participating in an AD domain environment, such as user account folder redirection and domain-based group policy settings for management. This server also runs Microsoft Exchange Server, which is set up to automatically download POP email for all users and make it available from multiple email clients. I even have a few SQL Servers running here and there, although so far this is not data that I have been too concerned with (mostly development projects with test data).

This configuration has allowed me to set up an environment where sharing resources is a cornerstone of the way we use our computers. Exchange allows access to our email from anywhere with rich calendar and address book support, whether simultaneously from multiple computers, remotely over the web (via the OWA client), or in offline mode on our computers. Folder redirection and offline files allow us to share and sync data effortlessly, with shared desktops, documents, favorites, or any other resource on the network, including shared applications. I can freely move from computer to computer, from inside the network to outside, and still have access to my email, data, and other network resources anywhere I go, whether online or offline. If it sounds like a complicated setup, that's because it is, but from a management point of view, once it was set up I spend very little time keeping it running. The downside to this configuration is that my data backup scenario is a bit more complicated than just copying files to a CD/DVD.

My primary goal of course is to not lose a user's data: the documents, photos, email, and other things that we all create. My secondary goal, however, is to preserve the state of the entire system. By state of the entire system I mean all the settings and configurations for each user account, on each computer, as well as the OS configuration of each computer. As I said before, it is a complicated system, and if something goes wrong I want to avoid having to rebuild systems from scratch if at all possible. My goal is to get back up and running as quickly as possible.

Running a domain controller also complicates backup strategies. Since the login accounts are all Active Directory domain accounts, if the state of my server is lost then so are the user accounts. Email is stored centrally in the Exchange database, which is tied to these AD accounts and requires a special backup process as well. All together, this necessitates something more than copying data files to a disc or an external hard disk.

My overall goal in creating a backup system is both data preservation and having as little downtime as possible when something does go wrong (and it will at some point). But I also want this to be as automatic as possible, something I can just set and largely forget. And I want it to be resilient too. These are bold goals for such a complicated setup and a one-man IT shop. My previous solution of scripting NT Backup, while simplistic, met some of these goals but failed at many of them as well.

Why my previous traditional backup solution wasn't good enough

My previous solution provided basic data and system state protection. I used simple scripts to control NT Backup to back up both the user data and the Windows system state nightly for each computer. I also used NT Backup to back up my Exchange server's data. I backed everything up to a secondary disk on my network. However, I only ever kept the previous backup set, so I had a history of exactly two backup cycles. By doing so, though, I had at least three copies of everything stored in two separate places.

My data volume, however, presented issues right from the start. I have a lot of data on one workstation in particular, mostly very large photo files. The data set size for that machine alone is currently 40GB if I don't count DV video, which effectively rules out running nightly full backups and keeping a lot of backup history. Incremental backups offer one solution to this problem, but I am not a fan of creating long chains of incremental backup sets, as that makes the restore process more complicated and time consuming. They have other drawbacks too: if one of the incremental backup sets fails, the chain is broken and you could lose the only backup copy of a particular file. As a compromise I eventually settled on a simple rotation scheme of full backups and differential backups. This made for a restore process that was at most two steps, a full backup restore followed by a single differential backup restore, while also ensuring that I had some file change history preserved.
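The key property of a differential backup is that it always compares against the last full backup, never against the previous differential, which is what caps the restore at two steps. A minimal sketch of that selection logic in Python (not my actual scripts, which drove NT Backup; the function name and modification-time comparison here are illustrative assumptions):

```python
import os


def select_for_differential(root, last_full_time):
    """Return the files under `root` modified since the last full backup.

    Every differential run compares against the same fixed point, the
    time of the last full backup, so restoring needs at most the full
    set plus the single newest differential set.
    """
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Anything touched after the full backup goes into every
            # subsequent differential, trading space for restore simplicity.
            if os.path.getmtime(path) > last_full_time:
                changed.append(path)
    return changed
```

An incremental scheme would instead advance the comparison point after each run, producing smaller sets but a longer restore chain.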

While my old scheme provided basic data protection, I felt that it was lacking in several ways. First off, it didn’t keep very much history, and I wanted to keep more. With that volume of data, it’s easy not to notice for quite a while that something has gone wrong. If a single file has been changed or lost, chances are that I will know immediately, as I probably caused it myself. But it’s not always that simple; files can get corrupted when hard drives have minor, unnoticeable failures. Other actions can also have consequences that aren’t immediately apparent. Any good backup system should keep a reasonable amount of history.

Secondly, a lot of redundant data was flying around each night. Differential backups are not very efficient. I had scheduled my backups to run in a staggered sequence so that they were not all competing for the server’s bandwidth at once. Still, so much duplicate data was getting copied each night that the process took around three hours, and sometimes a lot longer if a full backup of any machine was triggered. I also know that my data volume will continue to grow. I already have data (DV video and other media files) that I don’t currently back up but should. With my old process I couldn’t grow the data volume much before backups would start to take all night or longer. I needed to stop copying duplicate data as much as possible.

Lastly, I had no solution for offsite backups, nor did I have a solution for archiving data that doesn’t change much. I have DVD burners and I even have a 70GB DAT tape drive, but they were not integrated into my backup process. By using NT Backup on each machine I was left with a collection of large monolithic backup files that would not fit on backup media without file splitting, nor could they be sent to an Internet backup service in a reasonable amount of time.

Mirror mirror

From examining my current situation and researching possible new solutions, it was clear to me that there are now better ways to back up data than the traditional methods. What I really wanted was a mirror backup. However, since a mirror backup is a complete backup of everything frozen at a point in time, a simple file copy scheme is a very inefficient use of storage space when keeping history. Mirror backups do, however, make it very easy to upload incremental changes to online storage, as you can easily detect and upload just the files that changed since the last backup set. What is needed is an intelligent mirror backup that conserves space when storing history. One way to do this is by eliminating the physical storage of duplicate files. The solution I ultimately settled on does this by leveraging features of the NTFS file system to both eliminate duplicate file storage and still make it possible to browse a complete mirror backup with Windows Explorer.
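The general idea behind this kind of space-conserving mirror is the one rsync popularized with its --link-dest option: files unchanged since the previous snapshot are hard-linked to the existing copy rather than stored again, so each version of a file's content exists on disk exactly once while every snapshot folder remains a complete, browsable mirror. NTFS supports hard links just as Unix filesystems do. A minimal sketch of the technique in Python (this is an illustration of the concept, not my actual implementation; the function and parameter names are my own):

```python
import filecmp
import os
import shutil


def snapshot(source, prev_snap, new_snap):
    """Create a browsable mirror snapshot of `source` in `new_snap`.

    Files identical to their copy in the previous snapshot are
    hard-linked to it, so their content is stored only once no matter
    how many snapshots reference it. New or changed files are copied.
    """
    for dirpath, _dirnames, filenames in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        os.makedirs(os.path.join(new_snap, rel), exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            prev = os.path.join(prev_snap, rel, name)
            dest = os.path.join(new_snap, rel, name)
            if os.path.exists(prev) and filecmp.cmp(src, prev, shallow=False):
                os.link(prev, dest)      # unchanged: share the stored copy
            else:
                shutil.copy2(src, dest)  # new or modified: store fresh content
```

Because every snapshot directory looks like a full copy of the source, any single file (or an entire tree) can be restored with an ordinary file copy, and the set of newly stored files in each snapshot is exactly what needs uploading offsite.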


In part two I'll cover the intelligent mirror method that inspired my new backup strategy, rsync, and why rsync itself didn't work for me.


Comments (9)

manish
11/16/2006 3:28:08 AM #

Keep it up.
But how can I take an intelligent backup on Windows automatically?

Thanks

Jeff
10/30/2007 4:36:14 PM #

Another good suggestion for intelligent backup solutions is Secure Copy (http://www.scriptlogic.com/products/securecopy). It has some features that I was unable to find in other tools. When you're backing up server data, it keeps all security, permissions, shares, share permissions, and compression settings intact. The process of data copying is pretty fast because of multithreaded technology. Some options, like security override on access denied or copying only changed files, are really irreplaceable.

miky
6/23/2008 2:13:20 PM #

I found this Guide to online backup on Wikipedia! I thought it was extremely helpful so I put it here to share! (http://memopal.clickmeter.com/891931.html)! I just discovered online backup and I think it’s a good way to protect data! Can anyone confirm this???

Viru
9/18/2008 11:57:14 AM #

I have a query for you. Does a snapshot created using Microsoft VSS hold information about incremental or differential backups? What I mean is, if I want to take a differential backup, is all I need to do to take a snapshot with SetBackupState(differential)? Or do I need to do something more?

Thanks

David Jade
9/18/2008 8:05:27 PM #

It is my understanding (and experience) that a snapshot exposed through VSS will always contain all of the files on the volume. It is the responsibility of the process using VSS to determine which files in that snapshot need to be backed up.

That said, the differences between the incremental and differential backup types in VSS are a bit confusing to me. From reading the docs it seems that the VSS writers will do something different in the two cases; what, I am not sure. It sure would be nice if they could reset the original files' archive attributes, but I don't think that they do.

david

Blues
4/2/2009 4:02:52 AM #

When you run a FULL backup, it backs up every file, regardless of the state of the archive bit, and it resets all archive bits.
When you run an INCREMENTAL backup, it looks for the archive bit, backs up the file when it finds one, and resets the bit.
A DIFFERENTIAL backup looks for the archive bit, and backs up the file when it finds one, but does not reset the bit.

David Jade
4/4/2009 1:35:39 AM #

While your definition of full/inc/diff backups is 100% correct for a backup process, the Windows VSS subsystem also has a definition for differential snapshots that doesn't carry this same meaning. I believe that in this context it pertains to how the VSS snapshot data is stored on the volume, and not to how any process that is using VSS snapshots manages its physical backup process.

Gary M. Mugford
7/31/2009 4:55:16 PM #

Have you had a look at Drive Snapshot (http://www.drivesnapshot.de/en/)?

We've been using it at work and it handles our multiple servers quite well. It can be command-line driven, and thus open to batch files, and is the fastest backup we've ever seen. It backs up complete copies to rotating folders on rotating computers' external USB drives. At any time, we have a fifteen-day backlog. Once a week, we swap in a sixth external drive and take the one it's sidelining off site. It comes back a week later for the next in line to go off site. Total cost in external drives: a bit less than $800. But we have quick access to particular files and a feeling that we are decently protected against the ravages of chance.

HitMe WithIt
8/3/2009 12:22:32 AM #

I came here following a link on shellcity.net saying this was a freeware program to do back up. Apparently they are mistaken.
I hope you can release it to the public sometime. Sounds interesting.
Cheers


Flux and Mutability

The mutable notebook of David Jade