Gigabyte 8N SLI Quad Royal: Overclocking

FSB Overclocking Results

Front Side Bus Overclocking Testbed
Processor: Pentium 4 Prescott LGA 775, 840EE Dual Core 3.2GHz
CPU Voltage: 1.4125V (1.4V default)
Cooling: Gigabyte G-Power
Power Supply: OCZ PowerStream 520
Maximum CPU Overclock: 245FSB x 16 (3920MHz), +23%
Maximum FSB Overclock: 272FSB x 14 (3808MHz), +36%

Oddly enough, the shipping BIOS on our test board would not allow the front side bus to be raised much beyond stock once the multiplier was dropped to 14. We contacted Gigabyte about this issue, and they implemented changes in a new BIOS that allowed the 840EE to reach 272FSB with a 14x multiplier.
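As a sanity check on those numbers, the core clock is simply FSB x multiplier, and the percentage gains are measured against the 840EE's stock 200MHz FSB and 3.2GHz clock (the quoted +23% rounds up a 22.5% gain). A minimal Python sketch of the arithmetic:

```python
# Overclocking arithmetic for the results above: core clock = FSB x multiplier,
# with gains measured against the 840EE's stock 200MHz FSB and 3200MHz clock.

STOCK_FSB = 200      # MHz
STOCK_CLOCK = 3200   # MHz (200MHz FSB x 16)

def overclock(fsb, multiplier):
    clock = fsb * multiplier
    cpu_gain = (clock / STOCK_CLOCK - 1) * 100
    fsb_gain = (fsb / STOCK_FSB - 1) * 100
    print(f"{fsb}FSB x {multiplier} = {clock}MHz "
          f"(CPU +{cpu_gain:.1f}%, FSB +{fsb_gain:.1f}%)")

overclock(245, 16)   # 3920MHz: the maximum CPU overclock (+22.5%, ~23%)
overclock(272, 14)   # 3808MHz: the maximum FSB overclock (FSB +36.0%)
```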

Memory Stress Testing

Our memory stress tests examine the ability of the Gigabyte 8N SLI Quad Royal to operate at its officially supported DDR2 memory frequencies of 533MHz and 667MHz, using the best-performing memory timings that the Corsair CM2X512A-5400UL revision 1.3 modules will support.

Gigabyte 8N SLI Quad Royal
Stable DDR533 Timings - 2 DIMMs
(2/4 slots populated - 1 Dual-Channel Bank)
Clock Speed: 200MHz (800FSB)
Timing Mode: 533MHz - Default
CAS Latency: 3
RAS to CAS Delay: 2
RAS Precharge: 2
RAS Cycle Time: 4
Voltage: 2.1V
Command Rate: 1

The Gigabyte 8N SLI Quad Royal was completely stable with 2 DDR2 modules in Dual-Channel at the settings of 3-2-2-4 at 2.1V.
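Memory timings are counted in memory-clock cycles, so their absolute cost depends on the clock: DDR2-533 runs its memory clock at half the 533MT/s data rate, making each cycle roughly 3.75ns. A minimal Python sketch of the conversion, applied to the 3-2-2-4 (CAS-tRCD-tRP-tRAS) timings above:

```python
# Convert DDR2 timings from clock cycles to nanoseconds. The memory clock
# is half the DDR data rate, so DDR2-533 works out to ~266.5MHz here.

def timing_to_ns(cycles, data_rate_mts):
    clock_mhz = data_rate_mts / 2          # actual memory clock in MHz
    return cycles * 1000 / clock_mhz       # 1000/MHz = cycle time in ns

# The stable 3-2-2-4 DDR533 timings, expressed in absolute time:
for name, cycles in [("CAS", 3), ("tRCD", 2), ("tRP", 2), ("tRAS", 4)]:
    print(f"{name}: {cycles} cycles = {timing_to_ns(cycles, 533):.2f}ns")
```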

Filling all four memory slots usually places more strain on the memory subsystem than running two DDR2 modules on a motherboard.

Gigabyte 8N SLI Quad Royal
Stable DDR533 Timings - 4 DIMMs
(4/4 slots populated - 2 Dual-Channel Banks)
Clock Speed: 200MHz (800FSB)
Timing Mode: 533MHz - Default
CAS Latency: 3
RAS to CAS Delay: 2
RAS Precharge: 2
RAS Cycle Time: 4
Voltage: 2.1V
Command Rate: 1

The Gigabyte 8N SLI Quad Royal was completely stable with 4 DDR2 modules in Dual-Channel at the settings of 3-2-2-4.

We will now increase the memory frequency to 667MHz to see what effect this change has on the memory timings and stability of the board.

Gigabyte 8N SLI Quad Royal
Stable DDR667 Timings - 2 DIMMs
(2/4 slots populated - 1 Dual-Channel Bank)
Clock Speed: 200MHz (800FSB)
Timing Mode: 667MHz - Default
CAS Latency: 3
RAS to CAS Delay: 3
RAS Precharge: 3
RAS Cycle Time: 8
Voltage: 2.1V
Command Rate: 1

The Gigabyte 8N SLI Quad Royal was completely stable with 2 DDR2 modules in Dual-Channel at settings of 3-3-3-8 at 2.1V with the Command Rate left at 1. We will now try 4 modules at this speed.

Gigabyte 8N SLI Quad Royal
Stable DDR667 Timings - 4 DIMMs
(4/4 slots populated - 2 Dual-Channel Banks)
Clock Speed: 200MHz (800FSB)
Timing Mode: 667MHz - Default
CAS Latency: 4
RAS to CAS Delay: 4
RAS Precharge: 4
RAS Cycle Time: 16
Voltage: 2.2V
Command Rate: 1

The Gigabyte 8N SLI Quad Royal was completely stable with 4 DDR2 modules in Dual-Channel at settings of 4-4-4-16 at 2.2V with the Command Rate left at 1. These looser timings resulted in a slight decrease in benchmark scores during testing, but the impact was far smaller than the drop we saw when switching the Command Rate to 2. The Command Rate had to be set to 2 once we passed 675MHz on this board.
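That slight decrease makes sense in absolute terms: at the same DDR667 clock, the looser 4-4-4-16 timings simply cost more nanoseconds per access. A rough sketch using the same cycle-to-nanosecond conversion as before (first-access latency is only one contributor to benchmark scores, so this is illustrative rather than a full model):

```python
# Compare CAS latency in nanoseconds across the stable configurations above.
# Reuses the cycle-to-ns conversion from the earlier DDR533 sketch.

def timing_to_ns(cycles, data_rate_mts):
    clock_mhz = data_rate_mts / 2
    return cycles * 1000 / clock_mhz

configs = [
    ("DDR533 3-2-2-4 (2 or 4 DIMMs)", 533, 3),
    ("DDR667 3-3-3-8 (2 DIMMs)",      667, 3),
    ("DDR667 4-4-4-16 (4 DIMMs)",     667, 4),
]
for name, rate, cas in configs:
    print(f"{name}: CAS = {timing_to_ns(cas, rate):.1f}ns")
# DDR667 at CAS 3 gives the quickest first access (~9ns); dropping to CAS 4
# for four DIMMs (~12ns) gives back that advantage, hence the small dip.
```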

44 Comments

  • DrMrLordX - Thursday, October 13, 2005 - link

    Fine, I'll retract my statement, at least partially. I wasn't reading the statement carefully enough.

    Having looked into the newer 3D1-68GT, it seems to be a more solid product than the original 3D1 card based on 6600s. The original seemed to serve no purpose whatsoever.

  • Calin - Thursday, October 13, 2005 - link

    They made it an Intel board assuming that the more "corporate-oriented" users prefer multiple monitors. I don't know about current performance, but in the recent past, Intel processors smoked the Athlon64 at things like Photoshop. And the introduction of dual core processors at prices much lower than AMD's dual cores could coax someone into buying such a board.
    I agree that almost every normal person would be happy with four processors (powered by two cards), but I remember cases (in Linux) when OpenGL performance fell to half when enabling 2-monitor support on a single video card - and that was while driving a single monitor, not two. Driving two monitors, it fell even lower.
    So, for every person that WANTS (not that they really need) four-monitor output from four video cards, this looks like the best choice.
  • trooper11 - Thursday, October 13, 2005 - link

    quote:

    And introduction of dual core processors at prices much lower than AMD's dual core could coax someone into buying such a board


    I kind of doubt that, since the cost of the video equipment keeps this from being a low-cost solution. If a company is willing to shell out for that, they would be willing to shell out for the best in workstation performance, which just happens to be the X2s.
  • ElJefe - Wednesday, October 12, 2005 - link

    Ever wonder what crack they were smoking, though, making it an Intel board?

    If you read about modders and gamers, AMD has an 80%+ share among DIY builders.

    This board is a waste of technology.

    Still cool though.
  • Gary Key - Thursday, October 13, 2005 - link

    Hi,

    The ability to produce this board was a result of Nvidia's decision to use a HyperTransport link for the Intel SLI chipset, driven by the need for an on-chip memory controller. While it would be feasible to build an AMD version of the board, the engineering time and product cost would not be acceptable.

    While I will agree with everyone that the current AMD processor lineup offers significantly more performance than Intel's, the day-to-day difference between the two systems is not readily apparent to most people. In fact, I have had people play on my FX55 machine and my 840EE machine, and nobody could clearly decide which system had the AMD64 in it without benchmarks. This was at both 1280x1024 and 1600x1200 resolutions.

    While I personally favor AMD for most performance-oriented setups, there are some people that still want Intel. After not having an Intel-based machine for the last two-plus years, I have to admit it is not as bad as most people make it out to be.
  • Johnmcl7 - Wednesday, October 12, 2005 - link

    Whether you like it or not, the 3D1 was an innovative product; it's not child's play to stuff both cores together and develop the motherboard support for it.

    John
  • Viper20220k - Wednesday, October 12, 2005 - link

    Yeah, what is up with that... I would sure like to know also.
  • Wesley Fink - Wednesday, October 12, 2005 - link

    The pictures of the 10-monitor display were supplied, but Gary did hook up every monitor we could - which was 8, if I recall - to test the outputs. To test 10, we needed two more Rev. 2 3D1 cards; our extra pair were Rev. 1 cards, which couldn't get here in time for the review.

    We did verify the ability of the individual 3D1 cards to do what Gigabyte claimed, so there is no reason at all to doubt the 10-monitor claim. One of the key engineers at Gigabyte works exclusively with AT and THG. All sites use some pictures and diagrams from press kits to save time, but we perform and report our own test results and analysis.

    Yes, we did ALL of the testing ourselves. Our review took longer because we did much more extensive testing of the board, including quite a few overclocking tests to make sure the nVidia dual-core issue we reported in our last Intel SLI review is now fixed in this chipset.

    Gary spent countless hours sniffing out the good and the not so good on this board. We also found the OC capabilities of the shipping BIOS not too exciting, and we wanted to bring you the much improved OC results from the revised BIOS.
  • johnsonx - Wednesday, October 12, 2005 - link

    Wesley,

    I don't think most of your readers actually thought what the subject line of this thread implies. There are always a few who like to throw stones of course.

    In my reading of the THG article a few days ago, I found myself thinking that the 10-display shot was from Gigabyte, as they had no detail shots of the display control panel for 10 monitors; nine was the most they were able to get working.

    Like you, I have little doubt that 10 displays will in fact work with this board, but the 9th and 10th would have to come from either a PCI card or a PCIe card running in a x1 slot. Even x1 PCIe is faster than crusty old PCI, but it's still hardly ideal. It'd be nice if 3D1 cards could be coaxed into working in x8 slots (so that'd be 4 PCIe lanes per core - still plenty), as then you could theoretically have 4 3D1 cards for 16(!) displays.

    Thanks for the information on how you did the review testing.

    Regards,

    Dave
  • AmberClad - Wednesday, October 12, 2005 - link

    April Fool's Day already?!
