Test Setup - Software

With the variety of disk drive benchmarks available, we needed a means of comparing the true performance of the hard drives in real world applications. We will continue to utilize HD Tach, PCMark05, Disk Bench, IOMeter (enterprise drive comparisons), and our internal timing program for comparative benchmarks; however, we will also be adding some new tests. Our logical choice for application benchmarking is the Intel IPEAK Storage Performance Toolkit version 3. We originally started using this storage benchmark application in our Q2 2004 Desktop Hard Drive Comparison: WD Raptor vs. the World. The IPEAK test can be used to measure "pure" hard disk performance, and in this case, we kept the host adapter as consistent as possible while varying the hard drive models. The idea is to measure the performance of a hard drive on a consistent platform.

We utilize the IPEAK WinTrace32 program to record precise I/O operations when running real world benchmarks. WinTrace32 records only the accesses made to the operating system's host adapter driver. We then utilize the IPEAK AnalyzeTrace program to review the disk trace file for integrity and to ensure our trace files have properly captured the disk activities we report in our benchmarks. Intel's RankDisk utility is then used to play back the workload of all I/O operations that took place during the recording. RankDisk plays back every request exactly as generated in the WinTrace32 capture file.

RankDisk then generates results as a mean service time in milliseconds; in other words, the average time each drive took to fulfill each I/O operation. To make the data more understandable, we report the scores as an average number of I/O operations per second, so that higher scores translate into better performance in all of our IPEAK results. While these measurements provide a score representing "pure" hard drive performance, the actual impact of a drive's performance in real world applications can and will differ based upon other system level components. However, the Intel IPEAK tool set does generate an extremely accurate capture and playback of any given workload in an application. Based on this testing methodology, our results are a faithful representation of the drive's performance within the application.
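The conversion from RankDisk's mean service time to our reported score is a simple reciprocal; a minimal sketch (the function name is our own):

```python
def service_time_to_iops(mean_service_time_ms):
    """Convert a mean service time (ms per I/O) into I/O operations per second."""
    return 1000.0 / mean_service_time_ms

# A drive averaging 8 ms per I/O sustains 125 I/Os per second;
# one averaging 5 ms sustains 200, so lower service times yield higher scores.
print(service_time_to_iops(8.0))  # 125.0
print(service_time_to_iops(5.0))  # 200.0
```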

Our IPEAK tests represent a fairly extensive cross section of applications and usage patterns for both the general and enthusiast user. We will continually tailor these benchmarks with an eye towards each drive's intended usage and feature set when compared to similar drives. In essence, although we will report results from our test suite for all drives, it is important to realize that a drive designed for PVR duty will generate significantly different scores in our gaming benchmarks than a drive designed with gaming in mind, such as the WD Raptor. This does not necessarily make the PVR drive a bad choice for those who capture and manipulate video while also gaming. Hopefully, our comments in the results sections will offer proper guidance for making a purchasing decision in these situations.

The drive is formatted before each test run, and three tests are completed on each drive to ensure consistency in the benchmark results. The high and low scores are discarded, and the remaining (median) score is our reported result. We utilize the NVIDIA nF4 SATA ports along with the NVIDIA IDE-SW driver to ensure consistency in our playback results when utilizing NCQ, TCQ, or RAID settings.
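Discarding the high and low of three runs leaves the median; a minimal sketch of that selection (names are our own):

```python
def reported_result(runs):
    """Drop the high and low scores of three runs; the remaining (median) score is reported."""
    if len(runs) != 3:
        raise ValueError("expected exactly three test runs")
    return sorted(runs)[1]

print(reported_result([182.4, 179.9, 181.1]))  # 181.1
```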

Our IPEAK Test Suite consists of the following benchmarks:

VeriTest Business Winstone 2004: Trace file of the entire test suite that includes applications such as Microsoft Office XP, WinZip 8.1, and Norton Antivirus 2003.

VeriTest Multimedia Content Creation 2004: Trace file of the entire test suite that includes applications such as Adobe Photoshop 7.01, Macromedia Director MX 9.0, Microsoft Windows Media Encoder 9.0, Newtek Lightwave 3D 7.5b, and others.

AVG Antivirus 7.1.392: Trace file of a complete antivirus scan on our test bed hard drive.

Microsoft Disk Defragmenter: Trace file of the complete defragmentation process after the operating system and all applications were installed on our test bed hard drive.

WinRAR 3.51: Trace file of creating a single compressed folder consisting of 444 files in 10 different folders totaling 602MB. The test is split into the time it takes to compress the files and the time it takes to decompress the files.

File Transfer: Individual trace files of transferring the Office Space DVD files to our source drive and then transferring the files back to our test drive. The content being transferred consists of 29 files totaling 7.55GB.

AnyDVD 5.9.6: Trace file of the time it takes to "rip" the Office Space DVD. We first copy the entire DVD over to our Seagate 7200.10 750GB source drive, defragment this drive, and then measure the time it takes for AnyDVD to "rip" the contents to our test drive. While this is not ideal, it does remove the optical drive as a potential bottleneck during the extraction process and allows us to track the write performance of the drive.

Nero Recode 2: Trace file of the time it takes to shrink the entire Office Space DVD that was extracted in the AnyDVD process into a single 4.5GB DVD image.

Video Streaming: Trace file of the time it takes to capture and record Chapter 11 of Office Space with our NVIDIA DualTV MCE tuner card while viewing Chapter 10 utilizing PowerDVD 6. Chapter 10 has already been recorded and is playing from our source drive while Chapter 11 is being streamed from our Media Server.

Audio Encoding/Video Capture: Trace file of the time it takes Nero Digital Audio to extract all 16 tracks from the INXS Greatest Hits CD and convert them to MP4 format while capturing and recording Chapter 11 of Office Space with our NVIDIA tuner card. We changed the Nero defaults to the transcoder-ultra quality setting, variable bit rate, high encoder quality, and the LC AAC profile.

Game Installation: Individual trace files of the time it takes to install Oblivion, Sims 2, and Battlefield 2. We copy each DVD to our secondary Seagate 750GB drive, defragment the drive, and then install each game to our source drive.

Game Play: Individual trace files that capture the startup and about 15 minutes of game play in each game. Our Oblivion trace file consists of visiting 16 different areas within the game, interacting with individual characters, and passing through three different Oblivion gates. The Sims 2 trace file consists of the time it takes to select a pre-configured character; setup a university, downtown, and shopping district from each expansion pack (pre-loaded); and then visit each section before returning home. Our final trace file utilizes Battlefield 2 and we play the Daqing Oilfield map in both single and multiplayer mode.


44 Comments


  • JakeBlade - Friday, May 26, 2006 - link

    Interesting that this drive has a MADE IN SINGAPORE label instead of Seagate's usual MADE IN CHINA junk.
  • ElFenix - Friday, May 19, 2006 - link

    no reason to upgrade from my SE16, i see.

    i'd like to see a couple more drives in tests, such as the latest hitachi.
  • Gary Key - Friday, May 19, 2006 - link

    quote:

    i'd like to see a couple more drives in tests, such as the latest hitachi.


    The reason we did not include the Hitachi unit is that we have the revised 500GB unit arriving shortly; as mentioned in the article, we will have a complete 500GB roundup that also includes the new 7200.10. It will take some time to build the database with the revised test suite, as we also have additional application timer tests coming shortly.

    The performance across most of the recently released mainstream drives is so close now that it comes down to a personal decision on warranty, reliability, thermals/acoustics, and capacity for the most part. However, drives like the Raptor and RE2 series do make things interesting for SATA on the desktop as did this drive for a PVR fanatic. ;-)
  • ElFenix - Friday, May 19, 2006 - link

    i'd also like to see audio tests from a little bit further away. 5 mm doesn't give a realistic idea of how loud it will be sitting 3 feet away on the floor. plus, for all i know where you place the microphone is extremely important when at 5 mm.
  • Gary Key - Friday, May 19, 2006 - link

    quote:

    i'd also like to see audio tests from a little bit further away. 5 mm doesn't give a realistic idea of how loud it will be sitting 3 feet away on the floor. plus, for all i know where you place the microphone is extremely important when at 5 mm.


    There is no one good area to measure the acoustics, as you never know where the PC will be located or what type of case, fan noise, or ambient sounds are present. I can tell you that a drive that is loud at 3mm~5mm will be loud at three feet, all things being equal. Sound tones are also very subjective: the dull thumping sound this drive has under load might be perfectly acceptable, while the higher pitched clicking sound of a Maxtor will be unbearable for some people.

    We place the two mics at different points on the drive to ensure a consistent recording point. We assume most people will utilize a horizontal mounting point with the rear of the drive facing the case front, although we also test with the drive facing the case side, as this cage design is becoming very popular. The tone of the drive can change dramatically with the addition of rubber washers between the drive and the mount points.

    Thanks for the comments. :)
  • jhvtoor - Friday, May 19, 2006 - link


    Temperature measurement using S.M.A.R.T. is not reliable. The sensor and electronics on the hard drive are used, and they are not calibrated.

    I am using the freeware "HDD Health" utility to monitor the SMART information. It reported the drive temperature of my desktop as 12 degrees Celsius immediately after WinXP boot, while the room temperature was 19 degrees... I am not using any cooling techniques on this drive. This can only be explained by an inaccurate temperature measurement on this drive.

    I would suggest using an independent measurement instrument in the future. Attach the sensor in the middle of the cover plate.

  • Gary Key - Friday, May 19, 2006 - link

    Hi,

    1. We have found S.M.A.R.T. to be "fairly" accurate along with our capture utility. We know it is not perfect, but it allows us a consistent measurement of each drive in testing. In our 7200.10 test, ActiveSmart reported a temperature of 26C after boot; room temp was 22C. We put the drive through 15 minutes of light usage, let it idle for 15 minutes, and then report this number as our idle number. All of the drives we have tested have followed the same pattern, with a consistent idle reading after this usage; the idle temp will be the same 15 or 30 minutes later. If you stress the drive, you will see the temps rise accordingly and then fall back to the standing idle temp during the cooldown phase.

    2. One drawback is that the temperatures are not "real" time; there is a delay built in. This is why on the load test (and also idle) we loop PCMark05 several times and then take the reported temperature at the end of the session; generally the high temperature was actually reached in the previous loop.

    3. We have tried using a sensor, infrared, and other methods with varying results. The problem is that each section of the drive will report a different number. When we utilized a sensor on the top plate, the temps varied from drive to drive with the same model being tested. Each supplier uses different materials for their casings, which creates further variables; it just is not consistent enough to report.
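The kind of S.M.A.R.T. temperature polling discussed above can be sketched with smartmontools' `smartctl` tool. This is an illustrative sketch of our own, not the capture utility used in testing, and the attribute-row layout varies by vendor:

```python
import re
import subprocess

def parse_temperature(smartctl_output):
    """Pull the raw value of S.M.A.R.T. attribute 194 (Temperature_Celsius)
    out of `smartctl -A` text output. Returns None if the row is absent."""
    for line in smartctl_output.splitlines():
        # A typical attribute row:
        # 194 Temperature_Celsius 0x0022 036 053 000 Old_age Always - 36
        m = re.match(r"\s*194\s+Temperature_Celsius\b.*?(\d+)\s*$", line)
        if m:
            return int(m.group(1))
    return None

def drive_temperature(device="/dev/sda"):
    """Query a drive via smartmontools; requires smartctl installed and privileges."""
    out = subprocess.run(["smartctl", "-A", device],
                         capture_output=True, text=True).stdout
    return parse_temperature(out)
```

Note that some drives append min/max figures to the raw value column, which this simple pattern would not handle.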
  • toattett - Thursday, May 18, 2006 - link

    Apparently,
    If I want a speedy drive, I buy the raptor.
    If I want a super large drive, I buy the new 750GB Seagate.
    If I want good performance and a good amount of storage, I buy the 500GB WD.
  • Missing Ghost - Thursday, May 18, 2006 - link

    The pictures for the noise level are wrong. You put the dBA level as if it was a linear scale. It's not that way; the space between 0dB and 10dB should be smaller than the space between 10dB and 20dB. That way it will show more clearly the difference between the noise levels. It's a logarithmic scale.
  • Gary Key - Thursday, May 18, 2006 - link

    quote:

    The pictures for the noise level are wrong. You put the dBA level as if it was a linear scale. It's not that way; the space between 0dB and 10dB should be smaller than the space between 10dB and 20dB. That way it will show more clearly the difference between the noise levels. It's a logarithmic scale.


    Our current graph engine will not allow us to do this type of scale manipulation. We will probably have to utilize a Microsoft Excel chart in the next article. We agree with you; it is just not possible with the internal engine at this time, although we are working on a new one.
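To illustrate the logarithmic relationship under discussion, the sound power ratio implied by a decibel difference can be computed as follows (a minimal sketch; the function name is ours):

```python
def power_ratio(db_difference):
    """Sound power ratio corresponding to a difference in decibels.

    Each 10 dB step is a tenfold change in sound power, which is why a
    linear dBA axis visually compresses real differences between drives.
    """
    return 10 ** (db_difference / 10.0)

print(power_ratio(10))  # 10.0: a drive 10 dBA louder emits 10x the sound power
print(power_ratio(3))   # ~2.0: a 3 dBA gap is roughly a doubling
```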
