Ethernet Performance

As a refresher, the new motherboard test suite includes LAN performance measurements. Both of these boards use PCI Express LAN controllers; the only difference is the supplier of the core logic.

The Windows 2000 Driver Development Kit (DDK) includes a useful LAN testing utility called NTttcp. We used NTttcp to measure the Ethernet throughput and CPU utilization of the various Ethernet controllers used on the nForce4 Ultra motherboards.

We set up one machine as the server; in this case, an Intel box with an Intel CSA Gigabit LAN connection. Intel CSA has a reputation for providing fast throughput and this seemed a reasonable choice to serve our Gigabit LAN clients.

On the server side, we used the following command line, as suggested by the VIA whitepaper on LAN testing:
Ntttcps -m 4,0, -a 4 -l 256000 -n 30000
On the client side (the motherboard under test), we used the following command line:
Ntttcpr -m 4,0, -a 4 -l 256000 -n 30000
At the conclusion of the test, we captured the throughput and CPU utilization figures from the client screen.
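
For readers who want a rough feel for this kind of test without the DDK tools, the short Python sketch below runs a bare-bones TCP throughput measurement between two machines. It is only an illustration, not NTttcp: it does not measure CPU utilization, and the HOST, PORT, and buffer-count values are assumptions chosen to loosely mirror the -l 256000 and -n options above.

# Minimal TCP throughput sketch (illustrative only, not NTttcp).
# HOST and PORT are hypothetical; adjust them for your own test network.
import socket
import time

HOST = "192.168.0.10"   # hypothetical address of the receiving machine
PORT = 5001             # hypothetical test port
BUFFER_SIZE = 256000    # mirrors the -l 256000 option
NUM_BUFFERS = 3000      # reduced from -n 30000 to keep the example quick

def receive_test():
    # Run this on the "server" machine first: accept one connection and drain it.
    with socket.create_server(("", PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            while conn.recv(65536):
                pass

def send_test():
    # Run this on the "client" machine: send the buffers and report throughput.
    payload = b"\x00" * BUFFER_SIZE
    with socket.create_connection((HOST, PORT)) as sock:
        start = time.perf_counter()
        for _ in range(NUM_BUFFERS):
            sock.sendall(payload)
        elapsed = time.perf_counter() - start
    total_bits = BUFFER_SIZE * NUM_BUFFERS * 8
    print(f"Throughput: {total_bits / elapsed / 1e6:.1f} Mbit/s over {elapsed:.1f} s")

if __name__ == "__main__":
    send_test()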

Ethernet Throughput

The NVIDIA on-chip PCI Express LAN exhibits slightly higher throughput, but its CPU utilization is also slightly higher than that of the Broadcom solution on the Gigabyte board. The Marvell 88E8053 option on the MSI board offers excellent throughput, but at the price of almost double the CPU utilization of the other solutions.

All Ethernet tests were performed with standard frames and the NVIDIA Active Armor suite disabled. Gigabit Ethernet also supports jumbo frames, which should theoretically provide a further reduction in CPU overhead. We have seen test results showing that the combination of Active Armor and jumbo frames reduces CPU utilization to below 10%, which is very respectable performance for on-chip gigabit LAN.
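
As a rough illustration of why jumbo frames can help, the sketch below estimates how many frames per second the host has to process to saturate a gigabit link with standard 1500-byte frames versus 9000-byte jumbo frames. The figures ignore header and framing overhead and are only meant to show the order-of-magnitude reduction in per-packet work.

# Back-of-the-envelope estimate: fewer, larger frames mean fewer packets
# (and fewer interrupts) per second at the same line rate.
LINK_RATE_BPS = 1_000_000_000  # 1 Gbit/s

def frames_per_second(mtu_bytes: int) -> float:
    # Approximate frames/s needed to fill the link at a given frame size.
    return LINK_RATE_BPS / (mtu_bytes * 8)

standard = frames_per_second(1500)  # ~83,000 frames/s
jumbo = frames_per_second(9000)     # ~14,000 frames/s
print(f"Standard frames: ~{standard:,.0f}/s, jumbo frames: ~{jumbo:,.0f}/s "
      f"({standard / jumbo:.0f}x fewer frames with jumbo)")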

Comments

  • johnsonx - Friday, September 23, 2005 - link

    Gary, Wesley, et al:

    I'm just curious if you have any more information on the issue with the NForce4 chipset not supporting the 820, but working fine with the 830, 840 and EE. Why is this the case? There isn't any difference between the 820, 830 and 840 except for clock speed, so it really doesn't make any sense that the 820 doesn't work. Have you gotten any insight from nVidia as to the mechanism of the failure?
  • Bingo13 - Friday, September 23, 2005 - link

    Johnsonx,

    This is an excellent question. I do not have an official technical response from NVIDIA at this time but I will post an educated guess on my part.

    We know NVIDIA did not have the Pentium D 820 available during the design and tape-out of the nForce4 SLI Intel Edition chipset, in time to ensure proper validation testing and certification of this CPU. They were also faced with a marketing decision in regard to supporting a "budget" level dual-core CPU on a very high-end chipset that was mainly targeting the single-core CPU market at launch time. NVIDIA thought the typical dual-core user buying this type of board would want the 830 at a minimum, and more likely the 840. My opinion is that they underestimated the Intel enthusiast crowd, which flocked to the 820 based upon its price/performance and its ability to overclock easily past the 840's stock speed for 45% of the price.

    From a technical viewpoint, there is a large difference between the design specifications of the Intel Pentium D 820 and the Intel Pentium D 830/840 series, and I believe this is what caused NVIDIA grief given the short product launch window.

    The Intel Pentium D 830 and Intel Pentium D 840 processors incorporate Intel's EIST (Enhanced Intel SpeedStep Technology), which is also included on the 6xx line of processors, a line that starts at 3 GHz like the 830. EIST basically senses when the cores are being underutilized and will dynamically reduce the CPU multiplier (the minimum is 14x) to slow down the processor, reducing both energy consumption and heat.

    EIST is an integral part of Intel's TM2 (Thermal Monitor 2, which the nForce4 Intel chipset does not currently support but will in future releases) and C1E advanced halt state technologies, which constantly monitor the processor and dynamically reduce its speed in overheating and idle conditions, respectively. A speed of 2.8GHz (14 x 200) is the lowest state to which EIST will reduce a Pentium D or Pentium 4 6xx processor. This explains why the Intel Pentium D 820 (already at 2.8GHz) does not utilize this technology and does not have TM2 or C1E support.
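
    To make that arithmetic concrete, here is a small illustrative snippet (the stock multipliers are the published values for these CPUs; everything else simply follows from the 200MHz bus base):

    # Core clock = multiplier x 200 MHz bus base; EIST's floor is a 14x multiplier.
    FSB_MHZ = 200
    EIST_MIN_MULTIPLIER = 14

    def core_clock_ghz(multiplier):
        return multiplier * FSB_MHZ / 1000

    for cpu, mult in [("Pentium D 820", 14), ("Pentium D 830", 15), ("Pentium D 840", 16)]:
        stock = core_clock_ghz(mult)
        floor = core_clock_ghz(EIST_MIN_MULTIPLIER)
        headroom = stock - floor
        label = "no EIST headroom" if headroom == 0 else f"{headroom:.1f} GHz of EIST headroom"
        print(f"{cpu}: stock {stock:.1f} GHz, EIST floor {floor:.1f} GHz ({label})")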

    It is my opinion that NVIDIA was only able to get the Intel Pentium D 830/840 series to work properly (at stock multipliers) due to the chipset's initial support for the Pentium 4 6xx series processors. Given the timing and quick release of the Pentium D series, I believe NVIDIA worked feverishly to ensure basic EIST functionality in the nForce4 SLI Intel chipset for the Pentium D 830/840 series but did not have time to design and validate the unique code requirements for the 820, which lacks that technology. I think this also explains why the 830/840 series drops to single-core mode when the multiplier changes, since EIST support for the Pentium D series is limited. We have to remember that the nForce4 SLI Intel chipset was essentially ready and launching before the Pentium D came to market. I know NVIDIA wanted full support but, in my opinion, simply ran out of time. They have assured us repeatedly that their next chipset revision will have full support, and since they have now had some time to work with the Pentium D, I believe them (reserving future rights to deny this statement). ;-)

    Thank you,
    Gary
  • johnsonx - Friday, September 23, 2005 - link

    Thanks for those responses, Gary and Jarred. I'm inclined to agree with both of you - it's something in the mix of the 820 not having EIST and already being at the 14x multiplier. After all, you already found that running an 830 or 840 at 14x results in single-core-only mode, just like an 820 running at its stock multiplier of 14x. It may also be, as you imply, Gary, that nVidia was more or less just lucky to be able to support the Pentium D at all, through the work done to support the 600 series CPUs. They could easily have had ALL dual-core CPUs running in single-core mode.

    I think though that nVidia's claim that they thought enthusiasts buying such a high-end board wouldn't bother with a low-end 820 CPU is just hogwash. That's after-the-fact cover-up, period.



  • Gary Key - Sunday, September 25, 2005 - link

    quote:

    I think though that nVidia's claim that they thought enthusiasts buying such a high-end board wouldn't bother with a low-end 820 CPU is just hogwash. That's after-the-fact cover-up, period.


    Johnsonx,

    Your opinion is also widely shared by several Intel enthusiasts who emailed me. As I stated in the article, we believe the lack of full support for the Pentium D 820 was a mistake by NVIDIA and they certainly plan on correcting it and the multiplier issue in the next chipset revision.
    This is one of the main reasons (along with the multiplier overclocking issue) we thought the Gigabyte board was the better choice. I just wanted to reiterate that MSI did an incredible job with the P4N Diamond (the just-released 1.4 BIOS really makes this a sterling single-core Intel-based gaming board), but for a dual-core user, unless you do not plan on multiplier overclocking (these core speeds need it) and will only utilize the Pentium D 830/840, going with an Intel 945/955 chipset is "currently" the best solution.

    Sincerely,
    Gary

  • JarredWalton - Friday, September 23, 2005 - link

    I've wondered the same thing. My only guess is that the 820 doesn't support EIST - since it's already at the minimum 14X multiplier. Perhaps there's some glitch in the chipset with EIST, and that also shows up when running an 820?

    NVIDIA's statement that they didn't expect anyone to use a "budget" processor in such an expensive mobo is a bit fishy. The claim that the 820 wasn't initially planned for production (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) is even worse, in my view - we had information on that chip dating back to before December '04. If the nF4-SLI for Intel was complete already, it should have at least been revised.
  • AdamK47 3DS - Friday, September 23, 2005 - link

    Pretty good article. I was a little disappointed that AMD's name wasn't mentioned one hundred times in this AnandTech review.
  • rjm55 - Friday, September 23, 2005 - link

    You sound like a bitter Intel buyer who is upset no one is patting you on the back for making a wise purchase. As someone here said, there are lots of good reasons to buy Intel, but performance hasn't been one of them for over a year.

    Things will change, and when Intel is the performance leader again, or things are at least even - maybe late next year - then you can complain that Intel is mentioned 100 times in every AnandTech review.
  • mikecel79 - Thursday, September 22, 2005 - link

    I glanced at it briefly in the morning, but now it's gone from the front page. All the links in my history point to the search page now. For some reason, the comments are still here.
  • RandomFool - Friday, September 23, 2005 - link

    Oh good I thought I was losing my mind.
  • Wesley Fink - Friday, September 23, 2005 - link

    This article was put on hold while we were waiting for a reply from nVidia on the dual-core problems we found. Unfortunately, our posting engine had a mind of its own and the article posted by mistake. We pulled the article as soon as we realized what had happened.

    NVIDIA emailed and talked with us several times yesterday and today, and we learned that the dual-core issues Gary found were real and known to nVidia. Once that was clear, the article was revised and reposted. We apologize for the confusion of the yo-yo posting, but we were trying to get the full story for our readers before posting the article.

    Please welcome Gary Key as the newest reviewer at AnandTech. Gary did an incredible job in finding and quantifying this nVidia dual-core issue, where their Intel chipset switches to single-core mode when overclocking. nVidia assures us this will be fixed in future Intel chipset releases.
