There isn’t a wizard behind a curtain who will speed up your backups, but there is a formula for successfully cutting backup and recovery time. By following 10 simple tips, backup managers can cure many of their storage headaches. Bottlenecks appear throughout the storage landscape: source disks, small files, the backup server and the backup software are all places where backups can slow to a standstill. Let’s look at the individual tips for achieving faster backups:
You can keep data on easily accessible virtual tapes by using a disk-based virtual tape, dynamic virtual tape or virtual tape library (VTL) device as the primary target of the backup process. Virtual tapes typically let users employ their physical tapes more efficiently. In addition, when file systems on ordinary disks are used for backup over long periods, the constant writing, expiring and rewriting fragments the file system and causes performance to deteriorate. Virtual tapes carved directly from the logical volume manager do not use a file system, so file system fragmentation is minimized or eliminated.
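To make the fragmentation point concrete, here is a minimal Python sketch of streaming a backup image to a bare logical volume in one sequential pass. Both paths are hypothetical stand-ins, and a real VTL manages its volumes internally; the sketch only illustrates why sequential writes to a raw device cannot fragment a file system, because there is no file system involved.

```python
import shutil

# Hypothetical paths: a finished backup image, and a dedicated LVM
# logical volume with no file system on it.
SOURCE_IMAGE = "/var/backups/nightly.img"
RAW_DEVICE = "/dev/backup_vg/vtape01"

def write_to_virtual_tape(source_path: str, device_path: str) -> None:
    """Copy a backup image to a raw device in one sequential pass.

    Because the target is a bare logical volume, there is nothing to
    fragment as backups are written, expired and rewritten over time.
    """
    with open(source_path, "rb") as src, open(device_path, "wb") as dev:
        shutil.copyfileobj(src, dev, length=1024 * 1024)  # 1 MB chunks

if __name__ == "__main__":
    write_to_virtual_tape(SOURCE_IMAGE, RAW_DEVICE)
```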
The faster data moves from the backup server to the backup storage device, the faster the backup will be. iSCSI over Gigabit Ethernet (GigE) and Fibre Channel are ideal connections between backup servers and their storage: install the appropriate network card or host bus adapter in the backup server and configure the system to use the dedicated connection.
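Some back-of-the-envelope arithmetic shows why the transport matters. The dataset size below is hypothetical and the throughput figures are rough theoretical maximums, not measurements:

```python
# Rough sketch: how the transport between backup server and storage
# bounds the backup window. Figures are theoretical peak rates.
DATASET_GB = 500  # hypothetical nightly backup size

transports_mb_per_s = {
    "100 Mb Ethernet": 12.5,
    "Gigabit Ethernet (iSCSI)": 125.0,
    "4 Gb Fibre Channel": 400.0,
}

for name, mb_per_s in transports_mb_per_s.items():
    hours = (DATASET_GB * 1024) / mb_per_s / 3600
    print(f"{name}: ~{hours:.1f} h for {DATASET_GB} GB")
```

On this arithmetic, moving a 500 GB backup from 100 Mb Ethernet to GigE cuts the window from roughly 11 hours to just over one.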
Some companies run backups over the same network that carries e-mail, file, print and other business applications. Because today’s business users tend to work around the clock, this causes network congestion and eliminates night-time backup as an option; backing up over the corporate network degrades everyone’s performance, including users’ e-mail and Internet access. A separate network for backup is relatively inexpensive. You can easily set up a dedicated GigE backup network and install a new network card in each application server: moving from a typical 100 Mbps corporate LAN to GigE buys up to 10 times the throughput, improving efficiency and reducing elapsed time, while eliminating complaints from late-night workers whose systems would otherwise slow under the backup load.
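One way to picture the dedicated network, assuming the application server has a second NIC on a private backup subnet, is a sketch that binds the transfer socket to the backup interface’s address so the traffic stays off the corporate LAN. Both addresses are hypothetical:

```python
import socket

# Hypothetical addresses on a private backup subnet.
BACKUP_NIC_ADDR = "10.10.10.5"          # this server's backup-network IP
BACKUP_SERVER = ("10.10.10.1", 10000)   # backup server on the same subnet

def open_backup_channel() -> socket.socket:
    """Connect to the backup server over the dedicated GigE network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Binding the source address forces the OS to route this
    # connection out through the dedicated backup interface.
    sock.bind((BACKUP_NIC_ADDR, 0))
    sock.connect(BACKUP_SERVER)
    return sock
```

Binding the source address is the application-level way to pin a connection to one interface; in practice the backup software’s configuration or the OS routing table usually does this for you.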
Many people back up their servers sequentially: they back up server 1, then server 2, then 3 and 4, and so on. This quickly consumes the backup window. With a multi-stream-capable backup-to-disk target, you can back up all of the servers at the same time simply by modifying the backup scripts. Overall backup time drops because the formerly sequential jobs now overlap. (Each individual job takes a little longer when multiple data streams are processed at once, but the total time spent backing up is reduced.)
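A minimal sketch of the idea, using Python’s thread pool to overlap the jobs; the server names and the `backup-client` command are hypothetical stand-ins for your backup software’s CLI:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

SERVERS = ["server1", "server2", "server3", "server4"]  # hypothetical

def back_up(server: str) -> int:
    # Each job writes its own stream to the multi-stream disk target.
    return subprocess.call(["backup-client", "--host", server])

if __name__ == "__main__":
    # All four streams run concurrently, so total elapsed time
    # approaches the longest single job rather than the sum of all four.
    with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
        results = list(pool.map(back_up, SERVERS))
    print("exit codes:", results)
```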
Today, backup data is most often pulled from application servers over the LAN by a backup server and written to a backup target disk or tape. If one or more application servers take too long to back up, consider installing a media server version of your backup software on them; backup media servers write directly to tape or disk. Writing backup data directly from the application server to disk or tape, rather than having it pulled across the LAN, saves valuable time. You may have to buy an additional software license, but it will save time and money in the long run.
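A minimal sketch of the LAN-free pattern, with hypothetical paths: the application server, acting as its own media server, archives its data straight to locally attached (SAN or direct-attached) storage instead of streaming it across the LAN to the backup server:

```python
import tarfile

# Hypothetical paths: application data, and a SAN- or direct-attached
# backup disk mounted locally on the application server.
APP_DATA = "/var/lib/appdata"
LOCAL_TARGET = "/mnt/backup_disk/app.tar"

def lan_free_backup(source_dir: str, target_path: str) -> None:
    """Archive application data directly to local backup storage."""
    with tarfile.open(target_path, "w") as archive:
        archive.add(source_dir, arcname="appdata")

if __name__ == "__main__":
    lan_free_backup(APP_DATA, LOCAL_TARGET)
```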
Using the right tools is important in any job, and backup is no exception. Specialized backup and recovery appliances can receive and process multiple backup data streams simultaneously. This type of appliance is a natural fit both for the slow application servers newly equipped with media server licenses and for running multiple backup jobs in parallel.
Some backup software is inevitably faster than others, and users who test packages right out of the box with their default settings will notice big differences between them. Working with a value-added reseller can help you fine-tune the software recommended for your system and environment. Tuning the buffers, caches and block sizes can dramatically affect performance. Remember: if the software runs on both an application server and the backup server, it must be tuned in both places.
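To see how much block size alone matters, a rough Python sketch like this one times the same copy with different block sizes; the paths are hypothetical, and the test file should be larger than RAM so the OS page cache does not flatter the results:

```python
import time

# Hypothetical paths; create a large test file here first.
SOURCE = "/tmp/testdata.bin"
TARGET = "/tmp/testcopy.bin"

def timed_copy(block_size: int) -> float:
    """Copy SOURCE to TARGET in block_size chunks; return MB/s."""
    start = time.monotonic()
    copied = 0
    with open(SOURCE, "rb") as src, open(TARGET, "wb") as dst:
        while chunk := src.read(block_size):
            dst.write(chunk)
            copied += len(chunk)
    return copied / (1024 * 1024) / (time.monotonic() - start)

if __name__ == "__main__":
    # The same raw copy, three block sizes: the analogue of the buffer
    # and block-size knobs that backup software exposes.
    for size in (4 * 1024, 64 * 1024, 1024 * 1024):
        print(f"{size // 1024:5d} KB blocks: {timed_copy(size):7.1f} MB/s")
```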
It’s surprising how many Pentium II-class and other aging machines are still being used as backup servers. Hand-me-down hardware may be fine for long-term storage at home, but backup and recovery of critical data require current technology to keep up with the network and remain efficient. Backup is an intensive process, and older systems simply can’t pull data from application servers fast enough. For only a couple of thousand dollars, users can significantly speed backup and recovery by moving to a faster, more up-to-date server.
A large number of small files is a performance killer when backup is done file by file. Each small file carries nearly as much per-file overhead and metadata as a much larger BLOB (binary large object), so e-mail messages, Web graphics and small application files can be just as taxing to move as large chunks of data. In addition, small files sent across the network for backup can interfere with the larger BLOBs. Using image backup software to consolidate small files into BLOBs makes the data transfer more efficient, improving backup performance and reducing backup time.
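A minimal sketch of the consolidation idea, assuming hypothetical paths: sweep the small files into one tar archive so the network moves a single large object instead of thousands of tiny ones:

```python
import pathlib
import tarfile

# Hypothetical paths: a directory full of small files, and the
# consolidated archive the backup job will transfer instead.
SMALL_FILES_DIR = "/var/www/images"
ARCHIVE_PATH = "/var/backups/images.tar"

def consolidate(source_dir: str, archive_path: str) -> int:
    """Pack every file under source_dir into one tar archive."""
    count = 0
    base = pathlib.Path(source_dir)
    with tarfile.open(archive_path, "w") as archive:
        for path in base.rglob("*"):
            if path.is_file():
                archive.add(path, arcname=path.relative_to(base))
                count += 1
    return count

if __name__ == "__main__":
    n = consolidate(SMALL_FILES_DIR, ARCHIVE_PATH)
    print(f"packed {n} files into {ARCHIVE_PATH}")
```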
External RAID is usually best. A backup can only run as fast as the application server’s disks deliver the data, and the disk storage under an application server’s covers may not provide the performance backup requires. (External storage, including RAID, can also become a bottleneck if it is slow.) To improve backup performance, use fast, usually external, disk storage on application servers; this has the added benefit of improving application response as well. A quick sequential-read test, sketched below, shows whether the source disk is the limiting factor.
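As a rough check, assuming a test file larger than RAM (so the page cache does not flatter the numbers), a minimal Python sketch can time a large sequential read from the source disk; the path is hypothetical:

```python
import time

# Hypothetical path to a large file on the application server's disks.
TEST_FILE = "/data/large_test_file.bin"

def sequential_read_mb_per_s(path: str, block_size: int = 1024 * 1024) -> float:
    """Read the whole file sequentially and return throughput in MB/s."""
    start = time.monotonic()
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(block_size):
            total += len(chunk)
    return total / (1024 * 1024) / (time.monotonic() - start)

if __name__ == "__main__":
    rate = sequential_read_mb_per_s(TEST_FILE)
    print(f"source disk delivers ~{rate:.0f} MB/s sequential read")
```

By taking advantage of these best practices for backups, users will quickly reap the benefits of faster backups, freeing resources to focus on new storage initiatives.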
About the author
Robert Farkaly is director of disk-based products at Overland Storage.