Firewire and USB Performance

After looking at many options for Firewire and USB testing, we determined that an external hard disk with USB 2.0, Firewire 400, and Firewire 800 interfaces was a sensible way to measure USB and Firewire throughput.

Our first attempts at testing with an IDE or SATA drive as the "server" yielded very inconsistent results, since Windows XP sets up caching schemes to improve performance. We finally settled on a RAM disk as our "server", since memory removes almost all overhead from the serving end. We also disabled disk caching on the USB and Firewire side by setting up the drives for "quick disconnect", and our results were then consistent over many test runs.

We used 2GB of fast 3-2-2-4 system memory, set up as a 450MB RAM disk plus 1550MB of system memory. Our standard file is the SPECviewPerf install file, which is 432,533,504 bytes (412.4961MB). After copying this file to our RAM disk, we measured the time to write it from the RAM disk to our external drive over USB 2.0, Firewire 400, or Firewire 800, using a Windows timing program written for AnandTech by our own Jason Clark. The copy times in seconds were then converted into Megabits per second (Mb/s) to provide a convenient means of comparing throughput. Higher rates, therefore, mean better performance.
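
To make the conversion concrete, here is a minimal Python sketch of the math described above. The file size comes from this page; the decimal-megabit convention and the 60-second example are our assumptions, since the in-house timing utility itself is not published:

    # Convert a measured copy time into megabits per second (Mb/s).
    # Assumes decimal megabits (1 Mb = 1,000,000 bits); the exact
    # convention used by the timing tool is an assumption.
    FILE_BYTES = 432_533_504  # size of the SPECviewPerf install file

    def throughput_mbps(copy_seconds: float) -> float:
        return FILE_BYTES * 8 / 1_000_000 / copy_seconds

    # Example: a copy that takes 60 seconds works out to about 57.7 Mb/s.
    print(f"{throughput_mbps(60.0):.1f} Mb/s")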


Possibly the most striking finding in our Firewire and USB throughput tests is the performance of an external hard drive connected via Firewire 800. If you wonder why Firewire 800 matters, just look at the data: our benchmarks show Firewire 800 is up to 46% faster than the same drive connected to the more common Firewire 400, and about 29% faster than USB 2.0.

Our test is just one of many throughput tests, but in this benchmark, it is clear that the VIA Firewire 400 chip is faster than TI's 1394a chip. The NVIDIA nForce4 USB 2.0 controller is slightly faster than Intel's solution.

Comments

  • johnsonx - Friday, September 23, 2005 - link

    Gary, Wesley, et al:

    I'm just curious if you have any more information on the issue with the NForce4 chipset not supporting the 820, but working fine with the 830, 840 and EE. Why is this the case? There isn't any difference between the 820, 830 and 840 except for clock speed, so it really doesn't make any sense that the 820 doesn't work. Have you gotten any insight from nVidia as to the mechanism of the failure?
  • Bingo13 - Friday, September 23, 2005 - link

    Johnsonx,

    This is an excellent question. I do not have an official technical response from NVIDIA at this time but I will post an educated guess on my part.

    We know NVIDIA did not have the Pentium D 820 available in time during the design and tape-out of the nForce4 SLI Intel Edition chipset to ensure proper validation testing and certification of this CPU. They were also faced with a marketing decision regarding support for a "budget" dual-core CPU on a very high-end chipset that was mainly targeting the single-core market at launch. NVIDIA thought the typical dual-core user buying this type of board would want the 830 at a minimum, and more likely the 840. My opinion is that they underestimated the Intel enthusiast crowd that flocked to the 820 based on its price/performance and its ability to overclock easily past the 840's stock speed for 45% of the price.

    From a technical viewpoint, there is a large difference between the design specifications of the Intel Pentium D 820 and the Pentium D 830/840 series, and I believe this is what caused NVIDIA grief given the short product launch window.

    The Intel Pentium D 830 and 840 processors incorporate Intel's EIST (Enhanced Intel SpeedStep Technology), which is also included on the 6xx line of processors that, like the 830, start at 3GHz. EIST senses when the cores are being underutilized and dynamically reduces the CPU multiplier (to a minimum of 14x) to slow down the processor, reducing both energy consumption and heat.

    EIST is an integral part of Intel's TM2 (Thermal Monitor 2, which the nForce4 Intel chipset does not support but will in future releases) and C1E advanced halt state technologies, which constantly monitor the processor and dynamically reduce its speed in overheating and idle conditions, respectively. A speed of 2.8GHz (14 x 200) is the lowest state to which EIST will reduce a Pentium D or Pentium 4 6xx processor (a small sketch of this arithmetic follows at the end of this comment). This explains why the Intel Pentium D 820 (already at 2.8GHz) does not utilize this technology and does not have TM2 or C1E support.

    It is my opinion that NVIDIA was only able to get the Intel Pentium D 830/840 series to work properly (at stock multipliers) due to the chipset's initial support for the Pentium 4 6xx series processors. Given the timing and quick release of the Pentium D series, I believe NVIDIA worked feverishly to ensure basic EIST functionality in the nForce4 SLI Intel chipset for the Pentium D 830/840, but did not have time to design and validate the unique code requirements for the 820 due to its differing technology. I think this also explains why the 830/840 series drops to single-core mode when the multiplier changes, as the chipset's EIST support for the Pentium D series is limited. We have to remember that the nForce4 SLI Intel chipset was essentially ready to launch before the Pentium D came to market. I know NVIDIA wanted full support but, in my opinion, simply ran out of time. They have assured us repeatedly that their next chipset revision will have full support, and since they have now had some time to work with the Pentium D, I believe them (I reserve the right to deny this statement in the future). ;-)

    Thank you,
    Gary
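
    A minimal Python sketch of the multiplier arithmetic described in this comment; the 14x EIST floor and 200MHz bus clock come from Gary's explanation above, while the function itself is purely illustrative:

        # EIST reduces the CPU multiplier toward a 14x floor when the
        # cores are idle; the resulting clock is multiplier x bus clock.
        FSB_MHZ = 200    # 800 MT/s quad-pumped bus, 200 MHz base clock
        EIST_FLOOR = 14  # lowest multiplier EIST will select

        def eist_idle_ghz(stock_multiplier: int) -> float:
            """Clock speed after EIST throttles an idle processor."""
            idle_mult = min(stock_multiplier, EIST_FLOOR)
            return idle_mult * FSB_MHZ / 1000

        print(eist_idle_ghz(16))  # Pentium D 840 (3.2GHz stock) idles at 2.8GHz
        print(eist_idle_ghz(14))  # Pentium D 820 is already at the 14x floor,
                                  # so EIST has nothing to do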
  • johnsonx - Friday, September 23, 2005 - link

    Thanks for those responses, Gary and Jarred. I'm inclined to agree with both of you - it's something in the mix of the 820 not having EIST and already being at the 14x multiplier. After all, you already found that running an 830 or 840 at 14x results in single-core-only mode, just like an 820 running at its stock multiplier of 14x. It may also be, as you imply Gary, that nVidia was more or less just lucky that they were able to support the Pentium D at all, thanks to work done to support the 600 series CPUs. They could easily have had ALL dual-core CPUs running in single-core mode.

    I think though that nVidia's claim that they thought enthusiasts buying such a high-end board wouldn't bother with a low-end 820 CPU is just hogwash. That's after-the-fact cover-up, period.



  • Gary Key - Sunday, September 25, 2005 - link

    quote:

    I think though that nVidia's claim that they thought enthusiasts buying such a high-end board wouldn't bother with a low-end 820 CPU is just hogwash. That's after-the-fact cover-up, period.


    Johnsonx,

    Your opinion is also widely shared by several Intel enthusiasts who emailed me. As I stated in the article, we believe the lack of full support for the Pentium D 820 was a mistake by NVIDIA and they certainly plan on correcting it and the multiplier issue in the next chipset revision.
    This is one of the main reasons (along with the multiplier overclocking issue) that we thought the Gigabyte board was a better choice. I just wanted to reiterate that MSI did an incredible job with the P4N Diamond (the just-released 1.4 BIOS really makes this a sterling single-core Intel-based gaming board), but for a dual-core user, unless you do not plan on multiplier overclocking (these core speeds need it) and will only utilize the Pentium D 830/840, going with an Intel 945/955 chipset is "currently" the best solution.

    Sincerely,
    Gary

  • JarredWalton - Friday, September 23, 2005 - link

    I've wondered the same thing. My only guess is that the 820 doesn't support EIST - since it's already at the minimum 14X multiplier. Perhaps there's some glitch in the chipset with EIST, and that also shows up when running an 820?

    NVIDIA's statement that they didn't expect anyone to use a "budget" processor in such an expensive mobo is a bit fishy. The claim that the 820 wasn't initially planned for production (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) is even worse, in my view - we had information on that chip dating back to before December '04. If the nF4-SLI for Intel was complete already, it should have at least been revised.
  • AdamK47 3DS - Friday, September 23, 2005 - link

    Pretty good article. I was a little disappointed that AMD's name wasn't mentioned one hundred times in this AnandTech review.
  • rjm55 - Friday, September 23, 2005 - link

    You sound like a bitter Intel buyer who is upset that no one is patting you on the back for making a wise purchase. As someone here said, there are lots of good reasons to buy Intel, but performance hasn't been one of them for over a year.

    Things will change, and when Intel is the performance leader again, or things are at least even - maybe late next year - then you can complain that Intel is mentioned 100 times in every AnandTech review.
  • mikecel79 - Thursday, September 22, 2005 - link

    I glanced at it briefly in the morning, but now it's gone from the front page. All the links in my history point to the search page now. For some reason, the comments are still here.
  • RandomFool - Friday, September 23, 2005 - link

    Oh good I thought I was losing my mind.
  • Wesley Fink - Friday, September 23, 2005 - link

    This article was put on hold while we were waiting for a reply from nVidia on the dual-core problems we found. Unfortunately, our posting engine had a mind of its own and the article posted by mistake. We pulled the article as soon as we realized what had happened.

    NVIDIA emailed and talked with us several times yesterday and today, and we learned that the dual-core issues Gary found were real and known to nVidia. Once that was clear, the article was revised and reposted. We apologize for the confusion of the yo-yo posting, but we were trying to get the full story for our readers before posting the article.

    Please welcome Gary Key as the newest reviewer at AnandTech. Gary did an incredible job of finding and quantifying this nVidia dual-core issue, in which their Intel chipset switches to single-core mode when overclocking. nVidia assures us this will be fixed in future Intel chipset releases.
