What a year it has been for AMD and ATI. We have watched both companies go through pain and agony since their merger, as they experienced what can only be described as a perfect storm: we witnessed the return of Intel to CPU dominance, and NVIDIA basically owned the DirectX 10 market for seven months before AMD/ATI could respond. Unfortunately, we are still waiting on the CPU response to the Core 2 series, and the DirectX 10 GPU response was pretty much too little, too late. While we expect AMD will rebound shortly in the CPU and GPU markets, they have been very quiet on the chipset front since the merger.

The upcoming RD790 looks like it will be the chipset to beat when Phenom launches late this year, but for the most part AMD is content with the current 690G/690V chipset until its replacements launch sometime next year. We will look in-depth at several 690G boards in Part Three of our uATX roundup series shortly, but today we will take a look at the abandoned and somewhat ignored cousin of the 690G: the RS600.

The RS600/RD600 were ATI's planned answer last year to the G965/P965 duo from Intel, but unfortunately neither chipset launched on schedule, and the AMD buyout of ATI further complicated the marketing plans for these chipsets. Eventually, the RD600 was picked up by DFI and had modest success as a fairly quick and extremely flexible chipset, burdened somewhat by a memory controller that was just a little slower than the Intel P965/975X. The lack of Tier-1 acceptance and of any serious follow-up support from AMD more or less doomed ATI's last performance chipset for the Intel platform.

While ATI basically owned the Intel white box market for the last few years, the majority of that business has since gone to SiS, with Intel now also supplying its fair share of chipsets to this large and lucrative market. This is the market the RS600 was originally designed to compete in, with a promise to stay a couple of steps ahead of Intel in both features and video performance. ATI succeeded to some extent, but that pesky merger and the resulting product decisions meant this chipset fought an uphill battle all the way to its launch this past spring. Once again, there were no Tier-1 suppliers lined up, and questions about support and follow-on products dogged the RS600 when it was released by abit in the uniquely designed Fatal1ty F-I90HD.

Our opinions about the basic performance of this chipset/motherboard were extremely positive at launch; however, it was as if somebody upstairs was out for ATI/AMD blood, as the boards started failing. We experienced it within a couple of weeks, as did others, and any positive case that could have been made for marketing this chipset to other manufacturers probably fell on deaf ears. As it turns out, according to our sources within abit at the time, the initial production runs of the board had some quality issues; it was not a chipset problem. In fact, our second board has not failed after several hundred hours of daily usage and currently resides in one of our DVR machines.

Unfortunately, abit's initial BIOS support and commitment to solving issues has waned over the past sixty days, and we feel the board deserves continued support. It was therefore of great interest to us when ASRock called and asked us to take a look at their new 4Core1333-FullHD, based on the RS600. This board promises full 1080p playback capabilities and support for Intel's recently introduced 1333MHz FSB processors, although this support means auto-overclocking the FSB to 333MHz.

Now that we have two RS600 boards, it is quite easy to compare AMD's current IGP solution for the Intel platform against the latest Intel offering, the G33. We have not been kind towards Intel's IGP solutions, as we feel the continued minimal functionality of their graphics hardware (not to mention driver concerns) creates problems: developers want to move forward, but must always ensure their products run on the lowest common denominator platform. That platform has historically been Intel-based, as Intel is the world's largest graphics provider, although we wonder at times if they understand the importance of this fact.

This is not to say the AMD and NVIDIA IGP solutions are that much better; in many ways they are, but without earnest competition from Intel these solutions do just enough to stay ahead. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as BioShock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.

Of course, it is not all about gaming with these platforms, but even in video playback and general application performance we see better solutions from Intel's competitors. Hopefully, this will change with the upcoming G35 chipset from Intel, but we are not holding our breath.

Let's take a quick look at the specifications of the AMD RS600 chipset and its performance against a similarly priced G33 solution today.

22 Comments


  • Sargo - Tuesday, August 28, 2007

    Nice review, but there's no X3100 on the Intel G33. The GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture, so even the new drivers won't help that much.
  • ltcommanderdata - Tuesday, August 28, 2007

    Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and the IGP is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all have only SM2.0 pixel shaders, with no vertex shaders and no hardware T&L engine. The G965 and the GMA X3000 remain the top Intel IGP until the launch of the G35 and GMA X3500. I can't believe AnandTech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever-expanding portfolio of IGPs.

    Here's Intel's nice PR chart explaining the different IGPs:

    http://download.intel.com/products/graphics/intel_...

    Could you please run a review with the G965 chipset and the GMA X3100 using XP and the latest 14.31 drivers? They are now out of beta, and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers, although it has no VS to activate), the Xpress 1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 drivers and the previous 14.29 drivers that had no hardware support would also show how much things have improved.
  • JarredWalton - Tuesday, August 28, 2007

    I did look at gaming performance with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...); however, that was tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.
  • IntelUser2000 - Wednesday, August 29, 2007

    quote:

    I did look at gaming performance with a 965GM chipset in the PC Club ENP660 review; however, that was tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.


    It has the drivers for XP.
  • JarredWalton - Wednesday, August 29, 2007

    Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs DX9 mode, but it gets <10 FPS regardless of resolution.
  • IntelUser2000 - Wednesday, August 29, 2007

    I get 35-45 fps in the single-player demo for the first 5 minutes at 800x600 minimum settings. Didn't check more as it's limited.

    E6600
    DG965WH
    14.31 production driver
    2x1GB DDR2-800
    WD360GD Raptor 36GB
    WinXP SP2
  • IntelUser2000 - Tuesday, September 11, 2007

    Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can run the same tests you did for the Part 1 article.

    Like:
    -What version of BF2 used
    -What demos are supposed to be used
    -How do I load up the demos
    -etc
  • R101 - Tuesday, August 28, 2007

    Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

  • erwos - Tuesday, August 28, 2007

    I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.
  • Lonyo - Tuesday, August 28, 2007

    quote:

    This is not to say the AMD and NVIDIA IGP solutions are that much better; in many ways they are, but without earnest competition from Intel these solutions do just enough to stay ahead. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as BioShock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.


    quote:

    the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.



    Bioshock requires SM3.0.
