Microsoft's Data Protection Manager (DPM) 2006 is software that optimizes disk-based backup and recovery for Windows Server 2003 and related products. It shortens backup windows and provides instant file recovery and other sophisticated data protection services, but, in doing so, it consumes resources and imposes performance penalties on protected servers.
DPM uses a combination of logging, replication and snapshots to keep track of protected data and to make copies of it at user-defined intervals. Whenever data in a protected file is changed, the changes are logged on the protected server. That information is forwarded to the DPM server over the network on a regular schedule and stored there. The data is used to synchronize the replica of the protected file on the DPM server and, at regular intervals, a snapshot of the replica is made using Microsoft's Volume Shadow Copy Service (VSS).
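The cycle described above can be sketched in miniature. This is a conceptual illustration only, assuming a simple byte-range change log; the class and method names are hypothetical and do not correspond to DPM's actual components or APIs.

```python
# Hypothetical sketch of a DPM-style protection cycle:
# log byte-level changes, forward them on a schedule, apply them
# to a replica, and take periodic point-in-time snapshots.

from datetime import datetime, timezone


class ProtectedFileLog:
    """Byte-level change log kept on the protected server."""

    def __init__(self):
        self.changes = []  # (offset, new_bytes) pairs

    def record(self, offset, data):
        self.changes.append((offset, data))

    def drain(self):
        """Hand off accumulated changes for transfer to the DPM server."""
        batch, self.changes = self.changes, []
        return batch


class DpmReplica:
    """Replica of the protected file held on the DPM server."""

    def __init__(self, initial=b""):
        self.data = bytearray(initial)
        self.snapshots = []  # point-in-time copies (VSS's role, in effect)

    def synchronize(self, batch):
        # Apply only the changed byte ranges, not whole files.
        for offset, data in batch:
            end = offset + len(data)
            if end > len(self.data):
                self.data.extend(b"\x00" * (end - len(self.data)))
            self.data[offset:end] = data

    def snapshot(self):
        self.snapshots.append((datetime.now(timezone.utc), bytes(self.data)))


# A change is logged, forwarded on schedule, then snapshotted.
log = ProtectedFileLog()
replica = DpmReplica(b"hello world")
log.record(6, b"there")           # protected file changes locally
replica.synchronize(log.drain())  # scheduled sync over the network
replica.snapshot()                # scheduled point-in-time copy
print(bytes(replica.data))        # b'hello there'
```

Each of the three stages maps onto the load the article goes on to discuss: `record` costs the protected server, `synchronize` costs the network, and `snapshot` costs the DPM server and its storage.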
Obviously, this process imposes a load on system resources, including the protected server (logging), the network (synchronization) and, of course, the DPM server and associated storage. But how much of a load?
The Microsoft System Center Data Protection Manager Planning and Deployment Guide describes several techniques that can help you estimate these impacts.
The first step in calculating the load is to estimate how quickly the protected data changes each day. For a quick estimate, look at the size of your average daily incremental backup. This method is fast, but it isn't entirely accurate because of the way DPM works.
You can get a more accurate estimate by looking at the characteristics of the data you are protecting. Because DPM works at the byte level rather than the file level, the actual rate of data change may be a good deal lower than your incremental backup suggests. On the other hand, if you have a lot of small files that get overwritten during the day, your rate of change may be higher because each one of those changes will be captured by the DPM log on the application server.
Microsoft recommends assuming a data change rate 1.5 to 2 times the rate suggested by the daily incremental backup.
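The estimate is simple arithmetic. A minimal sketch, using the 1.5x-to-2x multiplier cited above; the function name and the sample figures are illustrative, not from Microsoft's guide.

```python
# Back-of-the-envelope estimate of daily data change for DPM
# capacity planning, using the 1.5x-2x multiplier described above.

def estimated_daily_change(incremental_backup_gb, low=1.5, high=2.0):
    """Return (low, high) bounds on likely daily data change, in GB,
    given the size of the average daily incremental backup."""
    return incremental_backup_gb * low, incremental_backup_gb * high


# Example: a 10 GB average daily incremental backup suggests planning
# for roughly 15-20 GB of logged change per day.
lo, hi = estimated_daily_change(10.0)
print(f"Plan for {lo:.0f}-{hi:.0f} GB of change per day")  # 15-20 GB
```

The result feeds directly into sizing the change log on the protected server and the synchronization traffic on the network.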
Rick Cook has been writing about mass storage since the days when the term meant an 80K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last 20 years he has been a freelance writer specializing in issues related to storage and storage management.