To maintain optimal system performance and reliability, enterprises should schedule disk defragmentation regularly across all servers and workstations. The ability to schedule, control, and monitor defragmentation is therefore highly relevant to TCO.
This becomes apparent when manual defragmentation is compared with centrally monitored network defragmentation.
Impact of manual defragmentation
It is neither practical nor cost-effective for IT support groups to run defragmentation manually, box by box, across an enterprise. This creates two basic problems:
First, the time and effort required to manually defragment servers and workstations throughout an enterprise increases TCO in proportion to the size and number of machines. Centralized defragmentation delivers TCO benefits even for a handful of machines; in mid-sized and large companies, manual defragmentation quickly becomes cost prohibitive.
Second, because manually defragmenting each individual server is so labor intensive, it would typically be performed reactively, if at all. A site would experience slowdowns that impact productivity, end users would complain about poor system performance, and IT staff would have to run the defragmentation software on specific workstations and servers. Along with lost performance, desktop support calls would increase significantly as reliability degrades. A manual process would thus create enough problems to forfeit much of the benefit defragmentation offers.
Cost advantages of network defragmentation
Let’s look at three typical examples of manual versus network defragmentation: the first involves a single server and 10 workstations; the second, 10 servers and 1,000 workstations; and the third, 25 servers and 5,000 workstations.
In each manual scenario, let’s assume it takes one hour per machine to defragment its disks, allowing enough time for an IT support person to schedule the activity, travel to the location, and perform the task. For the purposes of this example, we will further assume that defragmentation is performed once a week and that the IT support person is paid $40 per hour.
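A minimal sketch in Python makes the arithmetic concrete; it uses only the assumptions stated above (one hour per machine, one pass per week, $40 per hour), and the scenario labels are merely illustrative:

    # Estimated annual labor cost of manual defragmentation.
    # Assumptions, from the scenarios above: one hour per machine,
    # one pass per week, IT labor at $40 per hour.
    HOURS_PER_MACHINE = 1
    RUNS_PER_YEAR = 52
    HOURLY_RATE = 40  # USD

    scenarios = {
        "1 server, 10 workstations": 1 + 10,
        "10 servers, 1,000 workstations": 10 + 1_000,
        "25 servers, 5,000 workstations": 25 + 5_000,
    }

    for name, machines in scenarios.items():
        annual_cost = machines * HOURS_PER_MACHINE * RUNS_PER_YEAR * HOURLY_RATE
        print(f"{name}: ${annual_cost:,} per year")

Even the smallest scenario absorbs hundreds of staff hours per year; the larger ones run into millions of dollars in labor alone.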
The real cost of hardware upgrades
Many companies upgrade their hardware approximately every three years. In many cases, however, the performance gains expected from a hardware upgrade may be achievable simply by defragmenting the existing systems.
How much does it cost to improve system performance and reliability through a hardware upgrade or replacement? Unfortunately, an upgrade or replacement involves more than the cost of the hardware alone: the IT professional’s time must be factored in, as must the cost of system unavailability to the user. Using the same three scenarios as before, at an average cost (as of April 2003) of $1,800 per workstation and four hours of IT-staff time per upgrade, we can estimate the overall cost of replacement. Note: this figure is based on buying new equipment rather than upgrading individual components; given PC workstation economics, buying new is more cost efficient, and the older workstation can be re-deployed or scrapped.
Let’s assume the original workstations were purchased three years ago for $2,800 with a typical three-year life cycle, but that, due in large part to disk fragmentation, they have steadily deteriorated in performance and reliability, and the company decides it is time to replace them. The residual value after three years is estimated at 10 per cent, or about $280, which works out to a net cost of $2,520 over three years, or $840 per year.
At the end of the third year, new workstations with faster processors, more memory, and larger disks can be purchased for about $1,800, thanks to falling hardware prices. Over a five-year period, the cost then averages $756 per year: $2,520 for the initial workstation over the first three years, plus $1,260 for the replacement over the next two years (assuming a 30 per cent residual value), for a total of $3,780. Yet even with the upgrade, it is only a matter of time before the disk on the newer system also becomes fragmented, again producing a performance bottleneck.
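The depreciation arithmetic can be checked with a short sketch, again using only the figures stated above:

    # Average per-year hardware cost over a five-year window,
    # using the assumptions stated above.
    initial_price = 2_800        # original workstation, three years ago
    initial_residual = 0.10      # 10% residual value after three years
    replacement_price = 1_800    # replacement workstation (April 2003)
    replacement_residual = 0.30  # 30% residual value after two years

    first_three_years = initial_price * (1 - initial_residual)       # $2,520
    next_two_years = replacement_price * (1 - replacement_residual)  # $1,260
    total = first_three_years + next_two_years                       # $3,780

    print(f"Average cost per year: ${total / 5:,.0f}")  # $756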
Along with the actual cost of new hardware, factor in the time it takes to remove the older model and install the newer workstation. According to data from a previous IDC study, de-installing a workstation takes an average of two and one-half hours, and installing the new one another three and one-quarter hours, so five and three-quarter hours are absorbed per replacement. Total staff hours are rounded to the nearest hour, and the same $40 per hour is used for IT staff costs. In all three scenarios, bear in mind that only workstation and time costs are calculated; server expenses are excluded, though they have a significant impact on overall costs.
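Combining the hardware price with that labor figure gives a rough per-scenario replacement cost; the sketch below is an illustration rather than IDC’s own model, using the $1,800 unit price and the rounded six labor hours from the text:

    # Replacement cost per workstation: hardware plus IT labor.
    # Figures from the text: 2.5 h de-install + 3.25 h install,
    # rounded to the nearest hour, at $40/hour; $1,800 per new unit.
    HOURS = round(2.5 + 3.25)   # 5.75 -> 6 hours
    LABOR = HOURS * 40          # $240 per replacement
    HARDWARE = 1_800

    for workstations in (10, 1_000, 5_000):
        total = workstations * (HARDWARE + LABOR)
        print(f"{workstations:>5} workstations: ${total:,}")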
Conclusion
IDC has shown the value of using an automated network defragmentation tool, as compared with maintaining a Windows enterprise either without routine defragmentation or with a built-in, manual utility. Solving fragmentation helps companies gain user productivity, lower IT and help desk costs, reduce wasted hardware spending, and increase system uptime.
Frederick W. Broussard is an analyst at IDC Corp. in Framingham, Mass.