We are doing our DB2 backups to tape using GDG datasets. There is a lot of work involved because tablespaces are getting bigger and no longer fit on a single tape, so we need to split them across multiple tapes. How can we back up our data more efficiently?
You do not specify which release of DB2 you are using, but here is some advice. DB2 V7 provides more parallelism in the COPY and RECOVER utilities when writing to tape; with V6 it is more difficult.
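As a rough illustration of the V7 approach, the sketch below combines LISTDEF wildcarding, a TEMPLATE that allocates a new GDG generation per object, and the PARALLEL and TAPEUNITS keywords on COPY. The database name DBPAY, the dataset-name pattern, and the parallelism numbers are all hypothetical; check the syntax against your own release's Utility Guide before using anything like this:

```
TEMPLATE COPYTMP
         DSN 'PROD.IMGCOPY.&DB..&TS.(+1)'
         UNIT TAPE GDGLIMIT 10 DISP (NEW,CATLG,CATLG)
LISTDEF  PAYLIST
         INCLUDE TABLESPACE DBPAY.*
COPY LIST PAYLIST
     COPYDDN(COPYTMP)
     PARALLEL(4) TAPEUNITS(4)
     SHRLEVEL REFERENCE
```

Because the template allocates datasets dynamically, new tablespaces picked up by the LISTDEF wildcard get their own image copy datasets without JCL changes.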
You can manually break the copy work into multiple jobs and achieve parallelism that way, but the jobs would all finish at different times. For a large application that needs a single consistent recovery point, you would have to run one QUIESCE job after all the copy jobs complete, and recovery would then have to be to the quiesce point, not to the copy.
An upside to this approach is that the copies could all run SHRLEVEL CHANGE; the only outage would be for the quiesce itself. The downside is that in a busy shop you may not be able to get an application-wide quiesce.
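A minimal sketch of that quiesce-then-recover pattern follows. The tablespace names and the RBA value are placeholders; in practice you would take the RBA from the QUIESCE job output (or SYSIBM.SYSCOPY) and recover the whole set of tablespaces to that point:

```
-- Establish a single consistency point after all copy jobs finish
QUIESCE TABLESPACE DBPAY.TSEMP
        TABLESPACE DBPAY.TSDEPT
        TABLESPACE DBPAY.TSHIST
        WRITE YES

-- Later, recover the whole set to the quiesce point,
-- not to any individual image copy
RECOVER TABLESPACE DBPAY.TSEMP
        TABLESPACE DBPAY.TSDEPT
        TABLESPACE DBPAY.TSHIST
        TORBA X'00ABCD123456'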
And of course, there is always the option of looking at an ISV product to simplify your copying. For example, BMC Software has for years provided parallel processing, wildcarding, automatic GDG definition for new objects, application-wide quiesce and other functional improvements to the copy and recovery process.
For more information, see BMC's COPY PLUS and RECOVER PLUS product pages.
Thanks to Rick Weaver, Product Manager for BMC Software's Mainframe Database Recovery products, for his assistance in putting this response together.