TIP If your data is extremely critical and not easily reconstructed, you can often perform full backups every night and also squeeze in a quick incremental backup midday. This way, you can't lose more than a half day's worth of data.
You can also choose rotation schemes that are simpler than GFS. For instance, you may use just two or three tapes and then rotate them in sequence, overwriting the old data each time you do so. This lets you restore any of the previous three days' data. The shortcoming of this scheme is that you may need to go back further in time to restore data that was erased or damaged without anyone immediately noticing. You can combat this problem by using several tapes that you rotate weekly or monthly.
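To make the idea concrete, here is a minimal Python sketch of such a sequential rotation. The three-tape pool and the tape_for_backup helper are purely illustrative, not part of any particular backup product:

# Minimal sketch (illustrative): a simple three-tape rotation.
# Each backup overwrites the oldest tape, so only the last three
# backups can ever be restored.
TAPES = ["Tape A", "Tape B", "Tape C"]

def tape_for_backup(backup_number):
    """Return which tape to use for the Nth backup (1-based)."""
    return TAPES[(backup_number - 1) % len(TAPES)]

# Example: backups 1 through 7 cycle A, B, C, A, B, C, A.
for n in range(1, 8):
    print(n, tape_for_backup(n))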
One factor to keep in mind when considering different tape rotation schemes is the granularity of your backups. Generally, granularity refers to the flexibility that you retain to recover data from earlier tapes. In the standard GFS scheme, where full backups are made all the time, you could restore a file from any given day for a week's time, for any given end of the week (Friday) for a month's time, or for any given month for a year's time. You could not, however, restore a file that was created three months ago in the middle of the month and erased (or damaged) before the month was over, because a clean copy wouldn't exist on any of the backup tapes.
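The following Python sketch illustrates this limitation. It assumes a GFS retention of seven daily tapes, five Friday tapes, and twelve month-end tapes (your actual scheme may differ) and simply lists the restore points still on hand as of a given date:

# Illustrative sketch of GFS restore points (assumed retention:
# 7 daily tapes, 5 Friday tapes, 12 month-end tapes).
from datetime import date, timedelta

def gfs_restore_points(today):
    points = set()
    # Daily tapes: the last 7 days.
    for d in range(7):
        points.add(today - timedelta(days=d))
    # Weekly tapes: the last 5 Fridays.
    fridays, d = 0, today
    while fridays < 5:
        if d.weekday() == 4:           # Friday
            points.add(d)
            fridays += 1
        d -= timedelta(days=1)
    # Monthly tapes: the last day of each of the last 12 months.
    y, m = today.year, today.month
    for _ in range(12):
        points.add(date(y, m, 1) - timedelta(days=1))   # prior month end
        y, m = (y - 1, 12) if m == 1 else (y, m - 1)
    return sorted(points)

# A file created on the 5th of a month three months ago and damaged
# before month end never survives to a retained restore point: only
# that month's end-of-month tape is kept, and it holds the damaged copy.
for p in gfs_restore_points(date(2024, 6, 14)):
    print(p)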
The best advice for choosing a rotation scheme for important data is that unless there are reasons to do otherwise (as already discussed), use the GFS scheme with full backups. This maximizes the safety of your data, maximizes your restoration flexibility, and minimizes the risk of media failure. If other factors force you to choose a different scheme, use the discussions in this chapter to arrive at the best compromise for your situation.
Granularity and Data Corruption: A Tricky Balance
One reason to consider granularity carefully is the possibility of data becoming corrupted and the situation not being noticed. For instance, I once worked with a database file that had been corrupted several weeks earlier, but had continued to function and seemed normal. After problems started to develop, however, the database vendor's technical support staff discovered that a portion of the database that wasn't regularly used had become lost and wasn't repairable. The problem was caused by a bad sector on the database's hard disk. The only way that the support people could recover the database and ensure that it was clean was to restore backups, going further and further back in time, until they found a copy of the database that didn't have the damage. They then reentered the data that had been added since the nondamaged copy was made. Because of the increasing time span between backups as the support people dug further and further back in time, the amount of data that we needed to reenter grew rapidly.
Chapter Summary
You can be the most proficient person at networking in the world, but if you don't create and carefully manage an appropriate disaster recovery program for your company, you're not doing your job. The importance of this area cannot be overstated. In addition to this chapter, you should also study material covering specific backup and restore instructions for your network operating systems and databases, as well as the documentation for your backup hardware device and the backup software you select.
The next chapter discusses key information that you should know about selecting, installing, and managing servers. Servers are the heart of any network, and selecting reliable, productive servers not only will eliminate potential trouble spots for your network, but can also help you avoid needing to actually use the disaster recovery plans and strategies that you have put in place.
Chapter 13
Network Servers: Everything You Wanted to Know but Were Afraid to Ask
Many different types of servers exist: file and print servers, application servers, web servers, communications servers, and more. What all servers have in common, though, is that multiple people rely on them and they are usually integral to some sort of network service. Because servers are used by tens or hundreds (or thousands!) of people, the computers you use for servers need to be a cut (or two) above just any old workstation. Servers need to be much more reliable and serviceable than workstations. Plus, they need to perform in different ways from workstations.
This chapter covers network server hardware. You learn about what distinguishes a server from a workstation, about different server hardware configurations, and about preparing a server for use in your network.
What Distinguishes a Server from a Workstation?
With high-performance desktop computers selling for $1,500 to $3,000, it can be hard to see how a computer with the same processor can cost in excess of $7,000 just because it's designed as a "server." Server computers truly are different from workstations, however, and they incorporate a number of features not found in desktop computers. These features are important to a server's job, which is to serve up data or services to a large number of users as reliably as possible.
Server Processors
Much of the performance of a server derives from its central processing unit, or CPU. While servers are also sensitive to the performance of other components (more so than a desktop computer is), the processor is still important in determining how fast the server can operate.
Servers can run using one processor or many processors. How many processors you choose for a server depends on various factors. The first is the network operating system (NOS) you use. You need to carefully research how many processors are supported on your proposed NOS if you wish to use multiprocessing.
If you plan to use one of the Windows family of servers, you can use multiple processors, depending on which version and edition you plan to run. Windows 2000 Server can handle up to 4 processors, while Windows 2000 Advanced Server can handle up to 8 processors, and Windows 2000 Datacenter Server can handle up to 32 processors. For Windows Server 2003, both the Standard and Web editions support up to 2 processors, Enterprise edition supports up to 8, and Datacenter edition supports up to 32 (and up to 128 processors for the 64-bit variant). For Windows Server 2008, both the Standard and Web editions support up to 4 processors, while the Enterprise edition supports 8 processors, and the Datacenter edition supports 32 processors (up to 64 processors for the 64-bit variant).
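As a quick planning aid, the following Python sketch encodes the processor limits quoted above in a lookup table. The numbers come directly from the text, but the PROCESSOR_LIMITS table and the supports helper are hypothetical conveniences, not part of any Windows tool:

# Illustrative sketch: per-edition processor limits (from the text),
# used to sanity-check a planned server configuration.
PROCESSOR_LIMITS = {
    ("Windows 2000 Server", ""): 4,
    ("Windows 2000 Advanced Server", ""): 8,
    ("Windows 2000 Datacenter Server", ""): 32,
    ("Windows Server 2003", "Standard"): 2,
    ("Windows Server 2003", "Web"): 2,
    ("Windows Server 2003", "Enterprise"): 8,
    ("Windows Server 2003", "Datacenter"): 32,   # 128 on the 64-bit variant
    ("Windows Server 2008", "Standard"): 4,
    ("Windows Server 2008", "Web"): 4,
    ("Windows Server 2008", "Enterprise"): 8,
    ("Windows Server 2008", "Datacenter"): 32,   # 64 on the 64-bit variant
}

def supports(nos, edition, cpus):
    """Return True if the given NOS edition supports the planned CPU count."""
    return cpus <= PROCESSOR_LIMITS.get((nos, edition), 0)

print(supports("Windows Server 2003", "Enterprise", 8))   # True
print(supports("Windows Server 2008", "Standard", 8))     # False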