abit Fatality F-I90HD: FSB Overclocking

abit Fatality F-I90HD Overclocking Testbed

Processor: Intel Pentium (Core 2 based) E2160 - Dual Core, 1.8GHz, 1MB Unified Cache, 9x Multiplier, 800FSB
CPU Voltage: 1.4500V, 1.4750V
Cooling: Scythe Ninja Mini
Power Supply: SeaSonic S-12 II 430W
Memory: OCZ HPC Reaper PC2-6400 (4x1GB)
Memory Settings: DDR2-800 4-4-4-12 (2.00V)
Video Card: Gigabyte HD 2600XT
Video Drivers: AMD 7.8
BIOS: abit 1.4
Operating System: Windows Vista Home Premium 32-bit
Max Overclock (X1250): 9x315 - 2843MHz - 1.4500V - 58% Overclock
Max Overclock (HD 2600XT): 9x350 - 3157MHz - 1.4750V - 75% Overclock

Our best result with the integrated graphics solution and an E2160 was a final benchmark-stable setting of 9x315 FSB, resulting in a clock speed of 2843MHz, but it required a CPU voltage setting of 1.450V and a Northbridge setting of 2.016V. The board was actually capable of running at 9x325 FSB but would consistently fail our game benchmarks at that speed. Vdroop was acceptable, with a loss of around 0.02~0.03V during load testing. Unlike the ASRock board, the BIOS does not support overclocking the memory, so we stayed with our best settings of 4-4-4-12 at DDR2-800.

Switching to discrete graphics, we had fairly good luck with this board up to 9x350 FSB and decided to stay at that level even though we could run up to 9x374 FSB. The reason is that our first board fried itself after running at 9x370 during benchmark testing. This appears to be a common issue with early releases of the board, indicating the advertised quality components are not exactly up to rigorous testing.

We think it is probably a combination of faulty manufacturing or poor component selection early in production, plus the fact that PCI and PCI-E speeds are not adjustable and from all indications are tied synchronously to the FSB rate. Considering the BIOS options available on the ASRock board, we would have expected abit to do better with its overclocking options, including additional memory timings and settings for the PCI and PCI-E clocks.

Our final 9x350 FSB setting resulted in a CPU speed of 3157MHz, with 1.475V required for the CPU and 1.872V for the Northbridge. Once again, memory speed stayed static at DDR2-800 with timings of 4-4-4-12. Our current board has over 400 hours of 24/7 benchmark testing at these settings, so we feel comfortable with the voltages and settings used for this result. Temperatures under load never exceeded 51C at these settings, with idle temperatures hovering around 34C.

ASRock 4Core1333-FullHD: FSB Overclocking

ASRock 4Core1333-FullHD Overclocking Testbed

Processor: Intel Pentium (Core 2 based) E2160 - Dual Core, 1.8GHz, 1MB Unified Cache, 9x Multiplier, 800FSB
CPU Voltage: Auto (set by ASRock)
Cooling: Scythe Ninja Mini
Power Supply: SeaSonic S-12 II 430W
Memory: OCZ HPC Reaper PC2-6400 (4x1GB)
Memory Settings: DDR2-800 4-4-4-12 (2.00V)
Video Card: Gigabyte HD 2600XT
Video Drivers: AMD 7.8
BIOS: ASRock 1.30C (preliminary results with 1.30)
Operating System: Windows Vista Home Premium 32-bit
Max Overclock (X1250): 9x305 - 2745MHz - Auto Voltage - 52% Overclock (previous 1.30 BIOS)
Max Overclock (HD 2600XT): 9x343 - 3087MHz - Auto Voltage - 71% Overclock (previous 1.30 BIOS)

This board is like A Tale of Two Cities, both good and bad depending on the BIOS used. We were able to reach a final benchmark-stable setting of 9x305 FSB, resulting in a clock speed of 2745MHz with the X1250 integrated graphics solution. The board reached a final 9x343 FSB setting with the HD 2600 XT installed. Unlike the abit board, this board offers the option to overclock the memory along with a standard DDR2-1066 setting, although buying memory that will do either is a waste if you use an E2160 CPU. That's the good news.

The bad news is that the latest BIOS update, which adds support for Pioneer's Blu-ray drive and also improves disk and memory scores, does not overclock for us. ASRock has been able to overclock the board with the latest BIOS, so we are shipping our test sample back for analysis.

There is more bad news if you are interested in overclocking a board like this. The board does not support CPU multiplier adjustments and does not offer adjustable CPU voltages. The automatic CPU voltage system does work to a certain degree, as the board will increase the voltage slightly, but the inability to go much over +0.05V means your overclocking results will depend on the quality of your CPU. Considering the wealth of BIOS options on this board that are designed with overclocking in mind, it is puzzling that CPU voltage options are not available.

22 Comments

  • Sargo - Tuesday, August 28, 2007 - link

    Nice review, but there's no X3100 on Intel G33. The GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture, so even the new drivers won't help that much.
  • ltcommanderdata - Tuesday, August 28, 2007 - link

    Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and it is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all have only SM2.0 pixel shaders, with no vertex shaders and no hardware T&L engine. The G965 and its GMA X3000 remain the top Intel IGP until the launch of the G35 and GMA X3500. I can't believe AnandTech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever-expanding portfolio of IGPs.

    Here's Intel's nice PR chart explaining the different IGPs:

    http://download.intel.com/products/graphics/intel_...

    Could you please run a review with the G965 chipset and the GMA X3100 using XP and the latest 14.31 drivers? They are now out of beta and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers, although it has no VS to activate), the Xpress X1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 and the previous 14.29 drivers that had no hardware support would also show how much things have improved.
  • JarredWalton - Tuesday, August 28, 2007 - link

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...). However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    quote:

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review. However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.


    It has the drivers at XP.
  • JarredWalton - Wednesday, August 29, 2007 - link

    Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs DX9 mode, but it gets <10 FPS regardless of resolution.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    I get 35-45 fps in the single-player demo for the first 5 minutes at 800x600 minimum. Didn't check more as it's limited.

    E6600
    DG965WH
    14.31 production driver
    2x1GB DDR2-800
    WD360GD Raptor 36GB
    WinXP SP2
  • IntelUser2000 - Tuesday, September 11, 2007 - link

    Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can provide equal testing as you have done on the Pt.1 article.

    Like:
    -What version of BF2 used
    -What demos are supposed to be used
    -How do I load up the demos
    -etc
  • R101 - Tuesday, August 28, 2007 - link

    Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

  • erwos - Tuesday, August 28, 2007 - link

    I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.
  • Lonyo - Tuesday, August 28, 2007 - link

    quote:

    This is not to say any of the AMD and NVIDIA IGP solutions are that much better; they are in many ways, but without earnest competition from Intel these solutions do just enough to stay ahead of Intel. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as Bioshock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.


    quote:

    the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.



    Bioshock requires SM3.0.
