Test Setup - Software

With the variety of disk drive benchmarks available, we needed a means of comparing the true performance of hard drives in real world applications. We will continue to utilize HD Tach, PCMark05, Disk Bench, IOMeter (for enterprise drive comparisons), and our internal timing program for comparative benchmarks; however, we will also be adding some new tests. Our logical choice for application benchmarking is the Intel IPEAK Storage Performance Toolkit version 3. We originally started using this storage benchmark in our Q2 2004 Desktop Hard Drive Comparison: WD Raptor vs. the World. IPEAK can be used to measure "pure" hard disk performance, and in this case we kept the host adapter as consistent as possible while varying the hard drive models; the idea is to measure the performance of each drive on a consistent platform.

We utilize the IPEAK WinTrace32 program to record the precise I/O operations issued while running real world applications. WinTrace32 records only the accesses made to the operating system's host adapter driver. We then utilize the IPEAK AnalyzeTrace program to review each disk trace file for integrity and ensure our trace files have properly captured the disk activity we report in our benchmarks. Intel's RankDisk utility is then used to play back the workload of all I/O operations that took place during the recording. RankDisk plays back every request exactly as generated in the WinTrace32 capture file.
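Conceptually, the capture-and-replay mechanism works like the simplified sketch below. This is purely illustrative Python, not the IPEAK API or trace format; the per-request fields (operation, byte offset, transfer size) are assumptions chosen to mirror what WinTrace32 captures and what RankDisk times on playback.

```python
import time

# Illustrative trace of captured I/O requests; the (op, offset, size) record
# layout is an assumption for this sketch, not the actual IPEAK file format.
trace = [
    ("read", 1_048_576, 65_536),
    ("write", 52_428_800, 4_096),
    ("read", 2_097_152, 131_072),
]

def replay(trace, device_path):
    """Reissue each captured request against the target drive and time it."""
    service_times = []
    with open(device_path, "r+b", buffering=0) as dev:
        for op, offset, size in trace:
            start = time.perf_counter()
            dev.seek(offset)
            if op == "read":
                dev.read(size)
            else:
                dev.write(b"\0" * size)
            service_times.append(time.perf_counter() - start)
    # Mean service time in milliseconds, the figure RankDisk reports
    return 1000 * sum(service_times) / len(service_times)
```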

RankDisk reports its results as a mean service time in milliseconds: the average time the drive took to fulfill each I/O operation. To make the data easier to interpret, we convert the scores to an average number of I/O operations per second, so higher scores translate into better performance in all of our IPEAK results. While these measurements provide a score representing "pure" hard drive performance, the actual impact of a drive's performance in real world applications can and will differ based upon other system level components. However, the Intel IPEAK tool set generates an extremely accurate capture and playback of any given application workload, so our results are a faithful representation of the drive's performance within that application.
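The conversion we apply is simply the reciprocal of the mean service time. A minimal sketch (the 8.5 ms figure is illustrative, not a measured result):

```python
def iops_from_service_time(mean_service_time_ms: float) -> float:
    """Convert a mean service time (ms per I/O) into I/O operations per second."""
    return 1000.0 / mean_service_time_ms

# Example: a drive averaging 8.5 ms per request scores about 117.6 I/Os per second.
print(f"{iops_from_service_time(8.5):.1f}")  # -> 117.6
```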

Our IPEAK tests represent a fairly extensive cross section of applications and usage patterns for both the general and enthusiast user. We will continually tailor these benchmarks with an eye towards each drive's intended usage and feature set when compared to similar drives. In essence, although we will report results from our test suite for all drives, it is important to realize that a drive designed for PVR duty will generate significantly different scores in our gaming benchmarks than a drive designed with gaming in mind, such as the WD Raptor. This does not necessarily make the PVR drive a bad choice for those who capture and manipulate video while also gaming. Hopefully, our comments in the results sections will offer proper guidance for making a purchasing decision in these situations.

The drive is formatted before each test run, and three tests are completed on each drive in order to ensure consistency in the benchmark results. The high and low scores are discarded, with the remaining (median) score representing our reported result. We utilize the NVIDIA nF4 SATA ports along with the NVIDIA IDE-SW driver to ensure consistency in our playback results when utilizing NCQ, TCQ, or RAID settings.
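In other words, we report the median of the three runs. A minimal sketch of that selection rule (the scores shown are placeholders, not measured results):

```python
def reported_score(run_scores: list[float]) -> float:
    """Drop the high and low of three runs and return the remaining score."""
    assert len(run_scores) == 3, "methodology calls for exactly three runs"
    return sorted(run_scores)[1]

print(reported_score([118.2, 121.7, 119.9]))  # -> 119.9
```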

Our IPEAK Test Suite consists of the following benchmarks:

VeriTest Business Winstone 2004: Trace file of the entire test suite that includes applications such as Microsoft Office XP, WinZip 8.1, and Norton Antivirus 2003.

VeriTest Multimedia Content Creation 2004: Trace file of the entire test suite that includes applications such as Adobe Photoshop 7.01, Macromedia Director MX 9.0, Microsoft Windows Media Encoder 9.0, Newtek Lightwave 3D 7.5b, and others.

AVG Antivirus 7.1.392: Trace file of a complete antivirus scan on our test bed hard drive.

Microsoft Disk Defragmenter: Trace file of the complete defragmentation process after the operating system and all applications were installed on our test bed hard drive.

WinRAR 3.51: Trace file of creating a single compressed folder consisting of 444 files in 10 different folders totaling 602MB. The test is split into the time it takes to compress the files and the time it takes to decompress the files.

File Transfer: Individual trace files of transferring the Office Space DVD files to our source drive and then transferring the files back to our test drive. The content being transferred consists of 29 files totaling 7.55GB (the sketch after this list shows how average throughput is derived from the transfer time).

AnyDVD 5.9.6: Trace file of the time it takes to "rip" the Office Space DVD. We first copy the entire DVD over to our Seagate 7200.10 750GB source drive, defragment this drive, and then measure the time it takes for AnyDVD to "rip" the contents to our test drive. While this is not ideal, it does remove the optical drive as a potential bottleneck during the extraction process and allows us to track the write performance of the drive.

Nero Recode 2: Trace file of the time it takes to shrink the entire Office Space DVD that was extracted in the AnyDVD process into a single 4.5GB DVD image.

Video Streaming: Trace file of the time it takes to capture and record Chapter 11 of Office Space with our NVIDIA DualTV MCE tuner card while viewing Chapter 10 utilizing PowerDVD 6. Chapter 10 has already been recorded and is playing from our source drive while Chapter 11 is being streamed from our Media Server.

Audio Encoding/Video Capture: Trace file of the time it takes Nero Digital Audio to extract all 16 tracks from the INXS Greatest Hits CD and convert them into MP4 format while capturing and recording Chapter 11 of Office Space with our NVIDIA tuner card. We changed the Nero default quality settings to transcoder-ultra, variable bit rate, encoder quality to high, and the AAC profile to LC.

Game Installation: Individual trace files of the time it takes to install Oblivion, The Sims 2, and Battlefield 2. We copy each DVD to our secondary Seagate 750GB source drive, defragment the drive, and then install each game to our test drive.

Game Play: Individual trace files that capture the startup and about 15 minutes of game play in each game. Our Oblivion trace file consists of visiting 16 different areas within the game, interacting with individual characters, and passing through three different Oblivion gates. The Sims 2 trace file consists of the time it takes to select a pre-configured character; set up a university, downtown, and shopping district from each expansion pack (pre-loaded); and then visit each section before returning home. Our final trace file utilizes Battlefield 2, where we play the Daqing Oilfield map in both single-player and multiplayer modes.
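As noted in the File Transfer entry above, average throughput on a bulk transfer is simply the data moved divided by the elapsed playback time. A hypothetical worked example (the 140-second figure is made up for illustration, not a measured result):

```python
total_gb = 7.55      # data content of the 29 DVD files, in GB
elapsed_s = 140.0    # assumed elapsed transfer time, for illustration only
print(f"{total_gb * 1000 / elapsed_s:.1f} MB/s")  # -> 53.9 MB/s
```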

Comments

  • Gary Key - Thursday, May 18, 2006 - link

    quote:

    Then what was different than what Seagate claimed?
    Seagate's claims are correct from an objective measurement; subjectively, the drive was louder in our testing at full load with either read or write seeks. I added the subjective statement in this paragraph to convey what I was explaining further in the article. Thanks!! :)
  • Questar - Thursday, May 18, 2006 - link

    Gary, I hate to nit-pick, but even the revised version doesn't read well. You start off with Seagate's claim that the .10 is quieter than the .9, you say you found something different, and then talk about the .10 compared to the other drives.

    You need to say the drive is subjectively louder than the .9 (if it was).
  • Gary Key - Thursday, May 18, 2006 - link

    quote:

    Gary, I hate to nit-pick, but even the revised version doesn't read well. You start off with Seagate's claim that the .10 is quieter than the .9, you say you found something different, and then talk about the .10 compared to the other drives.


    Sorry about that; I had the WD 500GB statement in the sentence and not the Seagate 500GB, which was confusing. I read it so many times that I missed it. It should read better now. :)
  • Zoomer - Friday, May 26, 2006 - link

    Why don't you invite more people down to do some blind comparative tests?

    That would sort out some subjectivity. :)
  • ROcHE - Thursday, May 18, 2006 - link

    Will you guys review more standard sizes? Like 320GB or so.

    So far, I have only seen the 750GB model reviewed.
  • Gary Key - Thursday, May 18, 2006 - link

    quote:

    Will you guys review more standard sizes? Like 320GB or so.


    We will in June; Seagate will be shipping press samples out later this month. I want to see the 200GB~320GB drive range just as much as everyone else. ;-)
  • ROcHE - Thursday, May 18, 2006 - link

    It's already up for sale. ???

    Buy one and be the first to review :)
  • Gary Key - Thursday, May 18, 2006 - link

    quote:

    Buy one and be the first to review :)


    I already bought the additional 750GB, WD1500, and WD5000YS for RAID results. I do not know how much more the wife will let me spend this month. ;-) Anyway, Seagate is getting ready to ship two of the 320s out to us. Hopefully, I can get the review in before Computex. I am pretty much convinced this is the drive that will define the sweet spot in the market for performance, capacity, and price.
  • Zoomer - Friday, May 26, 2006 - link

    From the spec sheet, the 400GB model seems promising as a contender. It has a higher head-to-platter ratio. :)
  • ROcHE - Friday, May 19, 2006 - link

    Can't wait to see the results.
