Test Setup - Hardware

We have changed our test beds to reflect the current marketplace. While we wanted to move to the AMD AM2 platform, continual product delays forced us to stay with a Socket 939 based system. Given the continuing proliferation of dual core processors, along with roadmaps from AMD and Intel signaling the end of the single core desktop processor in the near future, we have upgraded from our AMD Athlon 64 3500+ to an AMD Opteron 170. This change will also allow us to expand our real world multitasking benchmarks in the near future. We will revisit our test bed requirements once we have had an opportunity to thoroughly test the AM2 and Intel Core 2 Duo platforms.

We debated the memory size for our IPEAK trace file creation and decided to move to 2GB of system memory. A 1GB memory configuration is still the predominant standard in the enthusiast community, but 2GB setups are fast becoming the new standard. Although a 1GB installation allows us to capture and report a higher amount of disk activity in certain applications, we made the switch now because the performance difference is minimal when compared to the 1GB trace files.

Standard Test Bed
Playback of IPEAK Trace Files and Test Application Results
Processor: AMD Opteron 170 utilized for all tests.
RAM: 2 x 1GB Corsair 3500LL PRO
Settings: DDR400 at 2.5-3-3-7 1T
OS Hard Drive: 1 x Maxtor MaXLine III 7L300S0 300GB 7200 RPM SATA (16MB Buffer)
System Platform Drivers: NVIDIA Platform Driver - 6.85
Video Card: 1 x Asus 7600GS (PCI Express) for all tests.
Video Drivers: NVIDIA ForceWare 84.21 WHQL
Optical Drive: BenQ DW1640
Cooling: Zalman CNPS9500
Power Supply: OCZ GamexStream 700W
Case: Gigabyte 3D Aurora
Operating System(s): Windows XP Professional SP2
Motherboards: MSI K8N Diamond Plus


Standard Test Bed
Creation of IPEAK Trace Files
Processor: AMD Opteron 170 utilized for all tests.
RAM: 2 x 1GB Corsair 3500LL PRO
Settings: DDR400 at 2.5-3-3-7 1T
OS Hard Drive: 1 x Maxtor MaXLine III 7L300S0 300GB 7200 RPM SATA (16MB Buffer)
System Platform Drivers: ATI Platform Driver - 1.1.0.0
Video Card: 1 x Asus 7600GS (PCI Express) for all tests.
Video Drivers: NVIDIA ForceWare 84.21 WHQL
Optical Drive: BenQ DW1640
Cooling: Zalman CNPS9500
Power Supply: OCZ GamexStream 700W
Case: Gigabyte 3D Aurora
Operating System(s): Windows XP Professional SP2
Motherboards: ECS KA1 MVP Extreme


We chose the ECS KA1-MVP as the platform for creating our IPEAK trace files. This affords us an updated system capable of correctly creating and storing our trace files on a SATA based drive. It also allows us to utilize a modern video card for the game play trace results, which are captured with the graphics settings at a typical 1280x1024 resolution.

You may have noticed that we did not use the MSI K8N Diamond Plus for both purposes, even though the balance of the component choices is essentially the same. We experienced inconsistencies with our trace files on this platform, on our ULi M1575 and M1697 boards, and on Intel based systems featuring the ICH6 or ICH7 chipsets. The ATI SB450 proved to be the only currently available chipset that produced repeatable results when utilizing the IPEAK WinTrace32 program. This is a known quirk of IPEAK: once trace files have been created that perform consistently, they play back fine on any platform, but creating them requires specific platforms and drives. Our prior trace files were developed on an Intel board with the ICH5 Southbridge.

Test Setup - Software Features and HD Tach Test
44 Comments

  • JakeBlade - Friday, May 26, 2006 - link

Interesting that this drive has a MADE IN SINGAPORE label instead of Seagate's usual MADE IN CHINA junk.
  • ElFenix - Friday, May 19, 2006 - link

    no reason to upgrade from my SE16, i see.

    i'd like to see a couple more drives in tests, such as the latest hitachi.
  • Gary Key - Friday, May 19, 2006 - link

    quote:

    i'd like to see a couple more drives in tests, such as the latest hitachi.


The reason we did not include the Hitachi unit is that we have the revised 500GB unit arriving shortly, and as mentioned in the article, we will have a complete 500GB roundup that includes the new 7200.10. It will take some time to build the database with the revised test suite, as we also have additional application timer tests coming shortly.

    The performance across most of the recently released mainstream drives is so close now that it comes down to a personal decision on warranty, reliability, thermals/acoustics, and capacity for the most part. However, drives like the Raptor and RE2 series do make things interesting for SATA on the desktop as did this drive for a PVR fanatic. ;-)
  • ElFenix - Friday, May 19, 2006 - link

i'd also like to see audio tests from a little bit further away. 5 mm doesn't give a realistic idea of how loud it will be sitting 3 feet away on the floor. plus, for all i know where you place the microphone is extremely important when at 5 mm.
  • Gary Key - Friday, May 19, 2006 - link

    quote:

    i'd also like to see audio tests from a little bit further away. 5 mm doesn't give a realistic idea of how loud it will be sitting 3 feet away on the floor. plus, for all i know where you place the microphone is extremely important when at 5 mm.


There is no single good distance at which to measure acoustics, as you never know where the PC will be located or what type of case, fan noise, or ambient sounds are present. I can tell you that a drive that is loud at 3mm~5mm will be loud at three feet, all things being equal. Sound tones are also very subjective: the dull thumping sound this drive has under load might be perfectly acceptable, while the higher pitched clicking sound of a Maxtor will be unbearable for some people.

We place the two mics at different points on the drive to ensure a consistent recording point. We assume most people will utilize a horizontal mounting point with the rear of the drive facing the case front, although we also test with the drive facing the case side, as this cage design is becoming very popular. The tone of the drive can change dramatically with the addition of rubber washers between the drive and the mount points.

    Thanks for the comments. :)
  • jhvtoor - Friday, May 19, 2006 - link


Temperature measurement using S.M.A.R.T. is not reliable. The sensor and electronics on the hard drive are used, and they are not calibrated.

I am using the freeware "HDD Health" utility to monitor the SMART information. It reported a drive temperature of 12 degrees Celsius immediately after WinXP boot, while the room temperature was 19 degrees. I am not using any cooling techniques on this drive. This can only be explained by an inaccurate temperature measurement on this drive.

I would suggest using an independent measurement instrument in the future. Attach the sensor in the middle of the cover plate.

  • Gary Key - Friday, May 19, 2006 - link

    Hi,

1. We have found S.M.A.R.T. to be "fairly" accurate along with our capture utility. We know it is not perfect, but it allows us a consistent measurement for each drive in testing. In our 7200.10 test, ActiveSmart reported a temperature of 26C after boot, while the room temperature was 22C. We put the drive through 15 minutes of light usage, let it idle for 15 minutes, and then report this number as our idle result. All of the drives we have tested have followed the same pattern, with a consistent idle reading after this usage; the idle temp will be the same 15 or 30 minutes later. If you stress the drive, you will see the temps rise accordingly and then fall back to the standing idle temp during the cooldown phase.

2. One drawback is that the temperatures are not reported in real time; there is a built-in delay. This is why on the load test (and at idle) we loop PCMark05 several times and then take the reported temperature at the end of the session; generally the high temperature was actually reached in the previous loop.

3. We have tried using a contact sensor, infrared, and other methods with varying results. The problem is that each section of the drive will report a different number. When we utilized a sensor on the top plate, the temps varied from drive to drive even with the same model being tested. Each supplier also uses different materials for their casings, which creates greater variability; it just is not consistent enough to report.
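
    The kind of S.M.A.R.T. readout discussed above can be scripted. The sketch below is illustrative, not the tool used in the article: it parses the attribute table that smartmontools' `smartctl -A` prints and pulls the raw value of attribute 194 (Temperature_Celsius). The sample output line and its values are hypothetical, and real drives vary in how they format the RAW_VALUE column.

    ```python
    import re

    def parse_smart_temperature(smartctl_output: str):
        """Extract the drive temperature (attribute 194) from `smartctl -A` text.

        The raw value of attribute 194 is vendor-defined and uncalibrated,
        which is why such readings are best used for relative comparisons
        between runs rather than as absolute temperatures.
        """
        for line in smartctl_output.splitlines():
            fields = line.split()
            if fields and fields[0] == "194":
                # RAW_VALUE is the tenth column; some drives append extra
                # text such as "(Min/Max 20/40)", so take the first integer.
                match = re.search(r"\d+", fields[9])
                if match:
                    return int(match.group())
        return None

    # Hypothetical output in the format smartmontools prints:
    sample = (
        "ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      "
        "UPDATED  WHEN_FAILED RAW_VALUE\n"
        "194 Temperature_Celsius     0x0022   026   040   000    Old_age   "
        "Always       -       26"
    )
    print(parse_smart_temperature(sample))  # 26
    ```

    Logging this value once a minute during an idle/load/cooldown cycle would reproduce the pattern Gary describes: a stable idle reading, a delayed rise under load, and a fall back to the standing idle temperature.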
  • toattett - Thursday, May 18, 2006 - link

    Apparently,
    If I want a speedy drive, I buy the raptor.
    If I want a super large drive, I buy the new 750GB Seagate.
If I want good performance and a good amount of storage, I buy the 500GB WD.
  • Missing Ghost - Thursday, May 18, 2006 - link

The pictures for the noise level are wrong. You put the dBA level as if it were a linear scale. It's not that way; the space between 0dB and 10dB should be smaller than the space between 10dB and 20dB. That way it would show more clearly the difference between the noise levels. It's a logarithmic scale.
  • Gary Key - Thursday, May 18, 2006 - link

    quote:

    The pictures for the noise level are wrong. You put the dbA level as if it was a linear scale. It's not that way, the space between 0dB and 10dB should be smaller than the space between 10dB and 20dB. That way it will show more clearly the difference between the noise levels. It's a logarithmic scale.


Our current graph engine will not allow this type of scale manipulation. We will probably have to utilize a Microsoft Excel chart in the next article. We agree with you; it is just not possible with the internal engine at this time, although we are working on a new one.
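
    The decibel arithmetic behind this exchange can be sketched in a few lines. This is a general illustration of how dB(A) figures behave, not a description of the article's measurement procedure: levels combine on a power basis rather than adding linearly, and in free-field conditions each doubling of distance drops the level by about 6 dB (real rooms with reflections will attenuate less).

    ```python
    import math

    def combine_db(levels):
        """Combine incoherent noise sources measured in dB.

        Sound levels add on a power basis, not linearly:
        L_total = 10 * log10(sum(10 ** (L / 10))).
        """
        return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

    def attenuate_db(level, ref_distance, distance):
        """Free-field attenuation with distance (inverse-square law):
        L2 = L1 - 20 * log10(d2 / d1)."""
        return level - 20 * math.log10(distance / ref_distance)

    # Two identical 30 dB sources together measure about 33 dB, not 60:
    print(round(combine_db([30.0, 30.0]), 1))  # 33.0
    # Doubling the measuring distance drops the reading by about 6 dB:
    print(round(attenuate_db(30.0, 1.0, 2.0), 1))  # 24.0
    ```

    This is also why a close-range reading still ranks drives usefully: moving the microphone from millimeters to feet shifts every drive's level down by roughly the same amount, leaving the relative ordering intact.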
