MSI P4N Diamond: Overclocking

Front Side Bus Overclocking Testbed
Processor: Intel Pentium Extreme Edition 840 (840EE) Dual Core 3.2GHz (Prescott, LGA 775)
CPU Voltage: 1.4V (1.350V default)
Cooling: Intel Stock Cooler
Power Supply: OCZ Power Stream 520
Maximum CPU Overclock: 238FSB x 16 (3808MHz) +19%
Maximum FSB OC: 0FSB x 14 (0MHz)

With the stock multiplier, the P4N Diamond reached an overclock of 238x16, or 3808MHz. Unfortunately, once we changed the multiplier, the system would disable the second core. We tested several BIOS versions from MSI, but the results remained the same. This only reinforced our finding that the nForce4 SLI Intel Edition chipset does not fully support the entire Pentium D processor lineup. To check further, we tried the Intel Pentium D 820, a low-cost dual-core Intel processor that is very popular with overclockers. The MSI board would not run the 820 in dual-core mode, switching it to single-core operation only.
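
For reference, the clock figures above fall straight out of FSB x multiplier arithmetic; below is a minimal sketch of that calculation in Python (the numbers are the ones from our table, the snippet itself is purely illustrative):

    # Final CPU clock is simply the front side bus clock multiplied by the CPU multiplier.
    stock_mhz = 200 * 16   # 3200 MHz - 840EE stock (200MHz FSB x 16)
    oc_mhz = 238 * 16      # 3808 MHz - our maximum stable overclock
    gain_pct = (oc_mhz - stock_mhz) / stock_mhz * 100
    print(f"{oc_mhz} MHz, +{gain_pct:.0f}% over stock")   # 3808 MHz, +19% over stock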

We sent our test findings to both MSI and NVIDIA for resolution. NVIDIA responded and stated that both issues are known limitations of the current NVIDIA chipset for Intel SLI and will be fixed in future product releases. NVIDIA added that the Intel Pentium D 820 had not been released when the NVIDIA Intel SLI package was designed, so the chipset was never validated with that CPU. At the time, they did not believe buyers of a top-end board with the NVIDIA nForce4 Intel Edition SLI chipset were likely to pair it with a value dual-core processor like the 820. However, the Pentium D 820 problem will also be fixed in future NVIDIA chipset revisions.

While we were able to reach these OC levels with the Intel HSF, the long-term stability of the system at these speeds is very suspect. After running several tests, the system began throttling, alternating between stock speed and reduced speed due to overheating. There is no doubt that the CPU can do these speeds or better, but it will require an alternative cooling system.
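
For readers who want to confirm whether a borderline overclock is throttling rather than crashing, polling the reported core clock during a stress run is enough to catch it. Below is a minimal sketch, assuming a Linux host that exposes the cpufreq interface (our testbed runs Windows, so treat this purely as an illustration):

    import time

    STOCK_KHZ = 3_200_000  # 3.2GHz stock clock for the 840EE

    # Poll core 0's reported clock once a second and flag any drop below stock.
    while True:
        with open("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq") as f:
            cur_khz = int(f.read())
        if cur_khz < STOCK_KHZ:
            print(f"throttling: core 0 at {cur_khz // 1000} MHz")
        time.sleep(1)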

Memory Stress Testing

Memory stress tests look at the ability of the MSI P4N Diamond to operate at the officially supported memory frequencies of 533MHz and 667MHz DDR2, at the best performing memory timings that the Patriot Extreme Performance PEP21G5600+XBL will support.
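
Conceptually, this kind of stress testing boils down to writing known patterns across as much memory as possible and verifying them back. The following is a minimal sketch of the idea, not the actual test suite we use (which also loads the CPU and rotates the patterns):

    import array

    def stress_pass(size_mb=64, pattern=0xA5A5A5A5):
        """Fill a buffer with a known pattern, read it back, and count mismatches."""
        words = size_mb * 1024 * 1024 // 4
        buf = array.array("I", [pattern]) * words
        return sum(1 for word in buf if word != pattern)

    if __name__ == "__main__":
        # Any non-zero result means the memory is not stable at the current timings/voltage.
        print("errors:", stress_pass())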

MSI P4N Diamond
Stable DDR533 Timings - 2 DIMMs
(2/4 slots populated - 1 Dual-Channel Bank)
Clock Speed: 200MHz (800FSB)
Timing Mode: 533MHz - Default
CAS Latency: 3
RAS to CAS Delay: 2
RAS Precharge: 2
RAS Cycle Time: 4
Voltage: 1.8V
Command Rate: 1

The MSI P4N Diamond was completely stable with 2 DDR2 modules in Dual-Channel at the settings of 3-2-2-4 at 1.8V.
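
Since these timings are counted in memory clock cycles, the absolute latency follows directly from the roughly 266MHz clock that DDR2-533 runs at; a quick back-of-the-envelope calculation:

    # DDR2-533 transfers data at 533 MT/s on a ~266.5 MHz memory clock (double data rate).
    mem_clock_mhz = 533 / 2
    cycle_ns = 1000 / mem_clock_mhz      # ~3.75 ns per memory clock
    cas_ns = 3 * cycle_ns                # CL3 -> ~11.3 ns to first data
    print(f"{cycle_ns:.2f} ns per cycle, CL3 = {cas_ns:.1f} ns")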

Filling all four available memory slots is more strenuous on the memory subsystem than testing 2 DDR2 modules on a motherboard.

MSI P4N Diamond
Stable DDR533 Timings - 4 DIMMs
(4/4 slots populated - 2 Dual-Channel Banks)
Clock Speed: 200MHz (800FSB)
Timing Mode: 533MHz - Default
CAS Latency: 3
RAS to CAS Delay: 2
RAS Precharge: 2
RAS Cycle Time: 4
Voltage: 1.95V
Command Rate: 1

The MSI P4N Diamond was completely stable with 4 DDR2 modules in Dual-Channel at the settings of 3-2-2-4, but the voltage had to be increased to 1.95V.

We will now increase the memory frequency to 667MHz to see what effect this change has on the memory timings and stability of the board.

MSI P4N Diamond
Stable DDR667 Timings - 2 DIMMs
(2/4 slots populated - 1 Dual-Channel Bank)
Clock Speed: 200MHz (800FSB)
Timing Mode: 667MHz - Default
CAS Latency: 4
RAS to CAS Delay: 3
RAS Precharge: 3
RAS Cycle Time: 8
Voltage: 1.9V
Command Rate: 1

The MSI P4N Diamond was completely stable with 2 DDR2 modules in Dual-Channel at the settings of 4-3-3-8 at 1.9V and leaving the Command Rate at 1.
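
In terms of first-word latency, the looser CL4 timing largely cancels out the higher clock, so the benefit of DDR2-667 comes mainly from bandwidth rather than latency. A quick comparison of the two settings we validated:

    # First-word (CAS) latency for the two validated settings.
    def cas_latency_ns(data_rate_mts, cl):
        return cl * 1000 / (data_rate_mts / 2)   # cycles x ns per memory clock

    print(f"DDR2-533 CL3: {cas_latency_ns(533, 3):.1f} ns")   # ~11.3 ns
    print(f"DDR2-667 CL4: {cas_latency_ns(667, 4):.1f} ns")   # ~12.0 ns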

Filling all four available memory slots is more strenuous on the memory subsystem than testing 2 DDR2 modules on a motherboard.

MSI P4N Diamond
Stable DDR667 Timings - 4 DIMMs
(4/4 slots populated - 2 Dual-Channel Banks)
Clock Speed: 200MHz (800FSB)
Timing Mode: 667MHz - Default
CAS Latency: 4
RAS to CAS Delay: 3
RAS Precharge: 3
RAS Cycle Time: 8
Voltage: 1.9V
Command Rate: 2

The MSI P4N Diamond was completely stable with 4 DDR2 modules in Dual-Channel at the settings of 4-3-3-8 at 1.9V, but the Command Rate had to be switched to 2. This resulted in a 1 to 3% decrease in benchmark scores during testing.

Comments

  • johnsonx - Friday, September 23, 2005 - link

    Gary, Wesley, et al:

    I'm just curious if you have any more information on the issue with the NForce4 chipset not supporting the 820, but working fine with the 830, 840 and EE. Why is this the case? There isn't any difference between the 820, 830 and 840 except for clock speed, so it really doesn't make any sense that the 820 doesn't work. Have you gotten any insight from nVidia as to the mechanism of the failure?
  • Bingo13 - Friday, September 23, 2005 - link

    Johnsonx,

    This is an excellent question. I do not have an official technical response from NVIDIA at this time but I will post an educated guess on my part.

    We know NVIDIA did not have the Pentium D 820 available during the design and tape-out of the nForce4 SLI Intel Edition chipset to ensure proper validation testing and certification of this CPU. They were also faced with a marketing decision with regard to supporting a "budget" level dual-core CPU on a very high-end chipset that was mainly targeting the single-core CPU market at launch time. NVIDIA thought the typical dual-core user buying this type of board would want the 830 at a minimum, and more likely the 840. My opinion is they underestimated the Intel enthusiast crowd that flocked to the 820 based upon its price/performance and its ability to overclock easily past the 840's stock speeds for 45% of the price.

    From a technical viewpoint there is a large difference between the Intel Pentium D 820 and the Intel Pentium D 830/840 series design specifications that I believe is what caused NVIDIA grief with the short product launch window.

    The Intel Pentium D 830 and Intel Pentium D 840 processors incorporate Intel's EIST (Enhanced Intel SpeedStep Technology), which is also included on the 6xx line of processors that, like the 830, start at 3GHz. EIST basically senses when the cores are being underutilized and will dynamically reduce the CPU multiplier (the minimum is 14x) to slow down the processor in order to reduce both energy consumption and heat output.

    EIST is an integral part of Intel's TM2 (Thermal Monitor 2, nForce4 Intel does not support but will in future releases) and C1E advanced halt state technologies which constantly monitor and will dynamically reduce the processor's speed in overheating and idle conditions respectively. A speed of 2.8GHz (14 x 200) is the lowest state to which EIST will reduce the Pentium D or Pentium 4 6xx processor speed. This explains why the Intel Pentium D 820 processor (already at 2.8GHz) does not utilize this technology and does not have TM2 or C1E support.

    It is my opinion that NVIDIA was only able to get the Intel Pentium D 830/840 series to work properly (at stock multipliers) due to the initial support for the Pentium 4 6xx series processors in the chipset. Due to the timing and quick release of the Pentium D series, I believe NVIDIA worked feverishly to ensure basic EIST functionality in the nForce4 SLI Intel chipset for the Pentium D 830/840 series but did not have time to design/validate the unique code requirements for the 820 due to its differing technology. I think this explains why the 830/840 series will go into single-core mode when the multiplier changes, as the chipset's EIST support for the Pentium D series is limited. We have to remember that the nForce4 SLI Intel chipset was ready and basically launching before the Pentium D came to market. I know NVIDIA wanted full support but just ran out of time, in my opinion. They have assured us repeatedly that their next chipset revision will have full support and, since they have had some time to work with the Pentium D, I believe them (I reserve future rights to deny this statement). ;-)

    Thank you,
    Gary
  • johnsonx - Friday, September 23, 2005 - link

    Thanks for those responses Gary and Jarred. I'm inclined to agree with both of you - it's something in the mix of the 820 not having EIST and already being at 14x multiplier. After all, you already found that running an 830 or 840 at 14x results in single-core only mode, just like an 820 running at the stock multiplier of 14x. It may also be, as you imply Gary, that nVidia was more or less just lucky that they're able to support the Pentium D at all, by work done to support 600 series CPUs. They could easily have had ALL dual-core CPUs running in single core mode.

    I think though that nVidia's claim that they thought enthusiasts buying such a high-end board wouldn't bother with a low-end 820 CPU is just hogwash. That's after-the-fact cover-up, period.



  • Gary Key - Sunday, September 25, 2005 - link

    quote:

    I think though that nVidia's claim that they thought enthusiasts buying such a high-end board wouldn't bother with a low-end 820 CPU is just hogwash. That's after-the-fact cover-up, period.


    Johnsonx,

    Your opinion is also widely shared by several Intel enthusiasts who emailed me. As I stated in the article, we believe the lack of full support for the Pentium D 820 was a mistake by NVIDIA and they certainly plan on correcting it and the multiplier issue in the next chipset revision.
    This is one of the main reasons (along with the multiplier overclocking issue) we thought the Gigabyte board was a better choice. I just wanted to reiterate that MSI did an incredible job with the P4N Diamond (the just-released 1.4 BIOS really makes this a sterling single-core Intel based gaming board), but for a dual-core user, unless you do not plan on multiplier overclocking (these core speeds need it) and will only utilize the Pentium D 830/840, going with an Intel 945/955 chipset is "currently" the best solution.

    Sincerely,
    Gary

  • JarredWalton - Friday, September 23, 2005 - link

    I've wondered the same thing. My only guess is that the 820 doesn't support EIST - since it's already at the minimum 14X multiplier. Perhaps there's some glitch in the chipset with EIST, and that also shows up when running an 820?

    NVIDIA's statement that they didn't expect anyone to use a "budget" processor in such an expensive mobo is a bit fishy. The claim that the 820 wasn't initially planned for production (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) is even worse, in my view - we had information on that chip dating back to before December 04. If the nF4-SLI for Intel was complete already, it should have at least been revised.
  • AdamK47 3DS - Friday, September 23, 2005 - link

    Pretty good article. I was a little disappointed that AMD's name wasn't mentioned one hundred times in this Anandtech review.
  • rjm55 - Friday, September 23, 2005 - link

    You sound like a bitter Intel buyer who is upset no one is patting you on the back for making a wise purchase. As someone here said, there are lots of good reasons to buy Intel, but performance hasn't been one of them for over a year.

    Things will change, and when Intel is the Performance Leader again, or things are at least even - maybe late next year - then you can complain that Intel is mentioned 100 times in every AnandTech Review.
  • mikecel79 - Thursday, September 22, 2005 - link

    I glanced at it briefly in the morning, but now it's gone from the front page. All the links in my history point to the search page now. For some reason the comments are still here.
  • RandomFool - Friday, September 23, 2005 - link

    Oh good I thought I was losing my mind.
  • Wesley Fink - Friday, September 23, 2005 - link

    This article was put on hold while we were waiting for a reply from nVidia on the dual-core problems we found. Unfortunately, our posting engine had a mind of its own and the article posted by mistake. We pulled the article as soon as we realized what had happened.

    NVIDIA emailed and talked with us several times yesterday and today, and we learned the dual-core issues Gary found were real and known to nVidia. Once that was clear the article was revised and reposted. We apologize for the confusion of the yo-yo posting, but we were trying to get the full story for our readers before posting the article.

    Please welcome Gary Key as the newest reviewer at AnandTech. Gary did an incredible job in finding and quantifying this nVidia dual-core issue with their Intel chipset switching to single-core mode in overclocking. nVidia assures us this will be fixed in future Intel chipset releases.
