Radeon Xpress 1250 Overview

The Radeon Xpress 1250 platform consists of an RS600 Northbridge and SB600 Southbridge. AMD's - or rather ATI's - original intent with this chipset was to provide an attractive alternative to the Intel G965 and now G33 family. The Radeon Xpress 1250 is aimed at the consumer market, with a heavy emphasis on multimedia capabilities via the X1250 graphics core with AVIVO.

In the case of the X1250, it is no surprise that AMD has reached back to previous generation hardware for the base design of their new integrated GPU. Lower transistor count means smaller die size and lower cost, and the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.

Many AVIVO features (including 10-bit per component processing) have been implemented on the X1250, bringing higher quality video decoding to integrated graphics. Unfortunately, with this improvement comes some sacrifice, as the number of pipelines on the X1250 is cut down from the X700. The X1250 weighs in at four pixel shaders, and like other R4x0 series hardware this also means four texture units, z-samples, and pixels per clock. An even bigger change compared to the X700 is that the number of vertex shader units has gone from six to zero: all vertex shader operations are handled by the CPU. The core clock operates at 400MHz and is not adjustable in current configurations.
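Those per-clock figures translate directly into a theoretical peak fillrate. A minimal sketch of the arithmetic (the function and constant names are ours, and real-world throughput will be lower, particularly with UMA memory bandwidth shared with the CPU):

```python
# Back-of-the-envelope peak fillrate from the specs above.
CORE_CLOCK_HZ = 400_000_000   # 400MHz core clock
PIXELS_PER_CLOCK = 4          # four pixel pipelines

def peak_fillrate_gpix(clock_hz: int, pixels_per_clock: int) -> float:
    """Theoretical peak pixel fillrate in gigapixels per second."""
    return clock_hz * pixels_per_clock / 1e9

print(peak_fillrate_gpix(CORE_CLOCK_HZ, PIXELS_PER_CLOCK))  # 1.6
```

By comparison, a desktop X700 with eight pipelines at the same clock would double that figure, which is one reason the X1250 sits firmly in the 2D/video segment rather than the gaming one.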

As for memory, the GPU can handle up to 512MB of memory, but support is once again dependent on the BIOS. AMD uses an optimized unified memory architecture (UMA) design, and all graphics memory is shared with system memory. For our tests, we found 256MB to be the sweet spot, as performance was not any different with 512MB of graphics memory, especially under Vista where the base memory requirements are significantly higher than XP. This may end up being different depending on implementation, but we will stick with the 256MB recommendation for now.

Looking beyond the architecture, most people who will actually be using integrated graphics won't be bothered with games or high-end 3D applications. This hardware will mostly be used for 2D and video applications. Let's take a look at the features we can expect in these areas.

Supporting a maximum resolution of 2048x1536, the X1250 can easily run any typical CRT at its maximum resolution. This matches Intel's G965 and revised G33 graphics core. Larger 30" flat panel monitors won't be able to run at native resolution, so the business user who needs huge desktop real estate will have to stick with add-in graphics cards. As for output features, the video hardware supports YPbPr, HDMI 1.2, and DVI. Of course, the actual interfaces available will depend on the implementation, but HDMI and DVI ports will also support HDCP 1.1.

The GPU supports two independent display outputs, and both DVI and HDMI outputs can be used at the same time. The only caveat is that HDCP will only work over one digital output at a time. This isn't a huge issue, as most people won't be watching two different protected movies at the same time on a single computer. Also, in spite of the single display limitation, HDCP can be used over either HDMI or DVI.

As for HDMI, the audio support is enabled through an interface in the RS600 Northbridge, while the SB600 Southbridge handles the HD audio controller interface. The standard HD audio codec is supplied by Realtek. The HDMI audio solution supports 32kHz, 44.1kHz, and 48kHz sample rates, with 2-channel plus AC3 (5.1) output.

For video acceleration features, the X1250 is capable of hardware accelerated MPEG-2 and WMV9 playback. MPEG-4 decoding is not hardware accelerated, but it is supported in software via the driver. DVD and TV (both SD and HD resolution) playback can be offloaded from the CPU, but we have seen some playback issues with HD media formats at 1080p on processors slower than an E6420.

For those who wish to use discrete graphics alongside their integrated solution, AMD supports a feature they call Surround View. This enables support for three independent monitors in systems with integrated and discrete AMD graphics. The feature works as advertised and may be useful for business users who want more than two monitors at a low price. Gamers who want more than two monitors will certainly have to take a different route.

22 Comments

  • Sargo - Tuesday, August 28, 2007 - link

    Nice review but there's no X3100 on Intel G33. GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture. Thus even the new drivers won't help that much.
  • ltcommanderdata - Tuesday, August 28, 2007 - link

    Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and the IGP is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all only have SM2.0 pixel shaders, with no vertex shaders and no hardware T&L engine. The G965 and the GMA X3000 remain the top Intel IGP until the launch of the G35 and GMA X3500. I can't believe Anandtech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever-expanding portfolio of IGPs.

    Here's Intel's nice PR chart explaining the different IGPs:

    http://download.intel.com/products/graphics/intel_...

    Could you please run a review with the G965 chipset and the GMA X3100 using XP and the latest 14.31 drivers? They are now out of beta and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers, although it has no VS to activate), the Xpress X1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 and the previous 14.29 drivers that had no hardware support would also show how much things have improved.
  • JarredWalton - Tuesday, August 28, 2007 - link

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...). However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    quote:

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review. However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.


    It has the drivers at XP.
  • JarredWalton - Wednesday, August 29, 2007 - link

    Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs DX9 mode, but it gets <10 FPS regardless of resolution.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    I get 35-45 fps on the demo single player for the first 5 minutes at 800x600 minimum. Didn't check more as it's limited.

    E6600
    DG965WH
    14.31 production driver
    2x1GB DDR2-800
    WD360GD Raptor 36GB
    WinXP SP2
  • IntelUser2000 - Tuesday, September 11, 2007 - link

    Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can provide equal testing as you have done on the Pt.1 article.

    Like:
    -What version of BF2 used
    -What demos are supposed to be used
    -How do I load up the demos
    -etc
  • R101 - Tuesday, August 28, 2007 - link

    Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

  • erwos - Tuesday, August 28, 2007 - link

    I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.
  • Lonyo - Tuesday, August 28, 2007 - link

    quote:

    This is not to say any of the AMD and NVIDIA IGP solutions are that much better; they are in many ways, but without earnest competition from Intel these solutions do just enough to stay ahead of Intel. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as Bioshock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.


    quote:

    the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.



    Bioshock requires SM3.0.
