Business Application Performance Explained

It has been a little over a month since we started using Business Winstone 2001 as a standard benchmark in our CPU reviews. Although we gave an overview of our history with the Ziff Davis Media Winstone benchmarks in our Celeron 766 review, we thought that some more explanation of these benchmarks would aid in understanding the scores that are spit out. Therefore, before we get to the actual Business Winstone 2001 numbers, let's see what this benchmark measures and how.

Business Winstone 2001 keeps in step with previous Business Winstone releases in that it measures system speed by timing how quickly the system can finish common business tasks. This time around, however, the applications have been updated to reflect current productivity tools and the scoring method has become more accurate.

Long gone in Business Winstone 2001 are outdated applications and utilities. The results of a benchmark run on an obsolete application are of little or no value to most readers, as performance in current applications is what we are concerned with. With this in mind, it was necessary for Ziff Davis Media to overhaul the aging Business Winstone 2000, which included older applications such as Microsoft Office 97.

The updated version of Business Winstone, the 2001 version, stresses a system by running the following applications and performing standard intensive tasks on them. The applications included in Business Winstone 2001 are Microsoft Access 2000, Microsoft Excel 2000, Microsoft FrontPage 2000, Microsoft PowerPoint 2000, Microsoft Word 2000, Microsoft Project 98, Lotus Notes R5, Nico Mak WinZip, Norton AntiVirus, and Netscape Communicator 4.7. Many of the tasks are performed with other applications open, simulating actual computer use. Some of the routine tasks performed include scanning a temporary directory for viruses, updating database views in Lotus Notes, and compressing various files.

As far as the reporting side goes, we are quite pleased with the way the new Business Winstone 2001 arrives at the final benchmark score. First off, erroneous scores are now eliminated because Business Winstone 2001 runs the benchmark five times to produce the overall score. The first run is always thrown out, and the highest score of the four remaining runs is recorded as the system score. Between runs, the system is automatically rebooted and defragmented. This method almost always produces runs that fall within 3% of each other, leaving very little room for variation.
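To make the selection method concrete, here is a minimal sketch in Python (our own illustration, not Ziff Davis Media's code) of how a reported score falls out of the five runs:

# Five runs are performed, the first is discarded as a warm-up run,
# and the highest of the remaining four becomes the reported score.
def reported_score(run_scores):
    if len(run_scores) != 5:
        raise ValueError("Business Winstone 2001 performs five runs")
    return max(run_scores[1:])

# Example with hypothetical run scores that land within a few percent of one another.
print(reported_score([19.2, 19.8, 19.9, 20.1, 19.7]))  # prints 20.1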

Speaking of scores, how does Business Winstone 2001 figure out what score to give a machine? It uses a rather simple formula: divide the time the test PC needs to complete the test by the time the "base" machine needs (180,000 milliseconds), then divide 10 by that ratio. Written out, the formula looks like "10 / (total milliseconds from the test PC / total base machine milliseconds)." The resulting number therefore represents how many times faster (or slower) the test system is than the base system, multiplied by 10. For example, a score of 20 means that the system tested performed twice as fast as the base system.
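Put another way, here is the same formula as a short Python sketch; the 180,000-millisecond base time is the figure quoted above, and the example test times are hypothetical:

BASE_TIME_MS = 180000  # time the "base" machine needs to finish the test

def winstone_score(test_time_ms):
    # Score = 10 divided by the (test time / base time) ratio,
    # which is equivalent to 10 * (base time / test time).
    return 10 / (test_time_ms / BASE_TIME_MS)

print(winstone_score(180000))  # 10.0 -- as fast as the base machine
print(winstone_score(90000))   # 20.0 -- twice as fast as the base machine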

Now that we understand Business Winstone 2001 a bit more, and know what those numbers it puts out mean, let's see how our current test system based on the Cyrix III with a Samuel II core performed compared to some similar systems.
