ASRock 4Core1333-FullHD: Board Layout and Features


ASRock has engineered a nice board with a good layout that has only a couple of problems. The board easily installs into a variety of micro-ATX cases and most connections are easily reached. The board features a four-phase voltage regulator system that provided excellent stability throughout our testing. Unlike the abit board, which uses a combination of conductive polymer aluminum solid capacitors and electrolytic capacitors, this board uses electrolytic capacitors exclusively.

The board comes with only two fan headers, with the CPU fan header located to the right, above the first DIMM slot. We usually prefer a minimum of three and preferably four or more fan headers on a motherboard in this category. Only the CPU fan header can be controlled via the BIOS, and ASRock does not offer a Windows-based utility for monitoring or controlling it.

Around the CPU socket area, we find ample room for the majority of cooling solutions. We utilized the stock heatsink/fan in our base testing but also verified several aftermarket Socket-775 cooling solutions would fit in this area during our overclocking tests. The 4-pin ATX power connector is placed on the left side of the board and did not interfere with our various cooling units.

The rear panel contains the standard PS/2 mouse and keyboard ports along with a parallel port for those who still use legacy peripherals. The panel also includes a LAN port, four USB ports, and an IEEE-1394 port. The audio panel consists of six ports that can be configured for 2-, 4-, 6-, or 8-channel audio from the Realtek ALC888 HD codec. The board also has DVI and D-Sub ports for video output. Note that a cable in the DVI port will block the PS/2 keyboard port for those using a USB to PS/2 adapter.


The DIMM slots are color coded correctly for dual-channel operation: matched modules simply go into the same colored slots. The memory modules are very difficult to install with a full-size video card in the PCI Express x16 slot. The 20-pin ATX power connector is located above the Northbridge and below the I/O panel. Cable routing to this location proved to be a little difficult with our Scythe Mini-Ninja cooler installed. The black floppy drive connector is located between the last PCI slot and the PCI-E x1 slot, making for an interesting cable placement if all of the slots are full.

We found the positioning of the four SB600 red SATA ports to be excellent when utilizing the PCI slots. The RS600 and SB600 chipsets are passively cooled and remained fairly cool to the touch throughout testing.

The board comes with one physical PCI Express x16 connector, one PCI Express x1 connector, two PCI 2.2 connectors, and an HDMR slot for those unfortunate enough to still be using a modem. The first PCI slot will be blocked by a dual-slot graphics card. The PCI Express x1 slot is a tight fit, as a card installed in this slot will have minimal clearance between the MCH heatsink and the video card.


Our choice of the Scythe Mini-Ninja heatsink for the overclocking tests worked out well, as it fit this board properly. Our only modification was to move the fan up slightly in order to clear the Northbridge heatsink, but otherwise the unit performed very well. We will cover this in an article on building a uATX system, but this heatsink did not work in our Silverstone SG-03 mATX case due to its height restrictions. Otherwise, this heatsink worked with the majority of our cases and performed admirably.


One of the main issues we had with our selection of test components was that our passively cooled MSI 8600 GTS card would not work in conjunction with our OCZ HPC Reaper memory. The heatpipes on the memory are too tall for our MSI card to fit. We had two solutions: either change the memory to a kit with a standard heat spreader design or find another 8600 GTS card. Fortunately, we had a new Galaxy 8600 GTS card that we utilized in testing on this board.

22 Comments

  • Sargo - Tuesday, August 28, 2007 - link

    Nice review, but there's no X3100 on Intel G33. The GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture, so even the new drivers won't help that much.
  • ltcommanderdata - Tuesday, August 28, 2007 - link

    Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and the IGP is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all only have SM2.0 pixel shaders with no vertex shaders and no hardware T&L engine. The G965 and the GMA X3000 remain the top Intel IGP until the launch of the G35 and GMA X3500. I can't believe Anandtech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever-expanding portfolio of IGPs.

    Here's Intel's nice PR chart explaining the different IGPs:

    http://download.intel.com/products/graphics/intel_...

    Could you please run a review with the G965 chipset and the GMA X3100 using XP and the latest 14.31 drivers? They are now out of beta and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers, although it has no VS to activate), the Xpress X1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 drivers and the previous 14.29 drivers that had no hardware support would also show how much things have improved.
  • JarredWalton - Tuesday, August 28, 2007 - link

    I did look at gaming performance with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...). However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    quote:

    I did look at gaming performance with a 965GM chipset in the PC Club ENP660 review. However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.


    It has the drivers under XP.
  • JarredWalton - Wednesday, August 29, 2007 - link

    Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs DX9 mode, but it gets <10 FPS regardless of resolution.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    I get 35-45 fps in the single-player demo for the first 5 minutes at 800x600 minimum settings. Didn't check more as it's limited.

    E6600
    DG965WH
    14.31 production driver
    2x1GB DDR2-800
    WD360GD Raptor 36GB
    WinXP SP2
  • IntelUser2000 - Tuesday, September 11, 2007 - link

    Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can run equivalent testing to what you did in the Pt. 1 article.

    Like:
    -What version of BF2 used
    -What demos are supposed to be used
    -How do I load up the demos
    -etc
  • R101 - Tuesday, August 28, 2007 - link

    Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

  • erwos - Tuesday, August 28, 2007 - link

    I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.
  • Lonyo - Tuesday, August 28, 2007 - link

    quote:

    This is not to say any of the AMD and NVIDIA IGP solutions are that much better; they are in many ways, but without earnest competition from Intel these solutions do just enough to stay ahead of Intel. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as Bioshock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.


    quote:

    the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.



    Bioshock requires SM3.0.
