X1250 vs. GMA X3100

For the graphics comparison, we decided to focus on the best-case scenario for AMD and compare performance using the TL-66 to the Intel GMA X3100. We will include charts later showing that the difference between using a TL-60 and a TL-66 isn't particularly great when it comes to gaming performance, but we wanted to make this point clear up front.

The Radeon X1250 is based on the Radeon X700 hardware, with a few changes. First, half of the pixel pipelines have been removed (which actually makes the hardware more like an X300/X600), leaving four pixel pipelines. All of the vertex shader pipelines have also been removed, leaving the CPU to handle that part of the graphics equation. Note also that the origins of this IGP mean it lacks support for Shader Model 3.0, though unlike the X300/X600 it does include SM2.0b support. The X1250 also includes some additional functionality related to video processing, although we won't be testing that area of performance in this article.

For the Intel camp, the GM965 Northbridge includes the GMA X3100 graphics processor. Figuring out exactly what is and isn't supported by this chip can be a little complex, in part because the drivers have been so bad (at least in terms of gaming support), particularly under Windows Vista. We can say for sure that the GMA X3100 supports at least a subset of SM3.0, because it is able to complete that section of Futuremark's 3DMark06, and it appears to be capable of running certain SM3.0 games. In terms of features, that theoretically moves the X3100 ahead of the X1250, and it should also be better than the GMA 3000/3100 that's found in the Q33/Q35/G33 chipsets.

In the past, the assumption has always been that NVIDIA and ATI/AMD integrated graphics solutions were superior to the offerings from Intel (as well as from smaller chipset companies like VIA). We want to answer a couple of questions in this article: first, does that still hold true (at least in the mobile market)? Second, even if the AMD Radeon Mobility X1250 (in this case) is faster than the GMA X3100, does it even matter? In other words, is the performance provided enough to actually run applications (games) that fail to run on competing hardware?

To help answer this second question, we will also include gaming performance results from a Gateway laptop (E-475M) that includes a Mobility Radeon HD 2300 discrete graphics chip. We're not yet ready to complete our review of the Gateway laptop, but we should have that ready within the next couple of weeks. The discrete graphics chip adds about $80 to the price of the laptop, which isn't too bad provided the performance increase is substantial. The Gateway E-475M was also equipped with a T7300 and 2GB of memory, so it ends up acting as the discrete GPU version of the dv6500t (which is also an option from HP).

Rather than starting with tons of graphs, we thought it would be easiest to use a table to summarize the performance differences. All games tested were run at both low (or in some cases very low) and medium detail settings, indicated by LQ/VLQ and MQ in the following table. Generally speaking, low quality means that we turned everything off, although in games that provide a VLQ/minimum detail setting we may still run at low quality if performance is acceptable. Medium quality sets everything to a middle value where possible. Here are the results.

GPU Performance Comparison

Game                                 X1250 vs.    HD 2300 vs.   HD 2300 vs.
                                     GMA X3100    GMA X3100     X1250
Battlefield 2 LQ                     -10.25%      200.33%       238.19%
Battlefield 2 MQ                       3.01%      161.00%       155.28%
Bioshock                                 N/A          N/A           N/A
Company of Heroes LQ                  15.01%      146.80%       113.97%
Far Cry LQ                            55.97%      245.60%       124.73%
Far Cry MQ                            62.78%      270.93%       128.66%
FEAR LQ                               29.70%      108.74%        60.88%
HL2: Episode One LQ                   52.33%      234.92%       119.75%
HL2: Episode One MQ                   31.16%      115.90%        64.72%
HL2: Lost Coast LQ                    50.58%      219.22%       111.81%
HL2: Lost Coast MQ                    26.96%      160.30%       104.86%
Quake 4 VLQ                            9.74%      269.64%       235.14%
Quake 4 MQ                           -25.38%      188.24%       286.27%
Oblivion LQ                              N/A          N/A       136.37%
STALKER LQ                             8.41%      180.88%       158.89%
Supreme Commander LQ                 -11.64%       46.73%        66.04%

Average Performance Change (LQ)       22.21%      183.65%       136.60%
Average Performance Change (MQ)       19.71%      179.27%       147.96%
Average Performance Change (Total)    21.31%      182.09%       140.37%
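The percentage deltas in the table above are simple relative frame-rate differences: (FPS of GPU A / FPS of GPU B - 1) * 100. As a minimal sketch of the math (the FPS numbers here are hypothetical, not our actual benchmark data):

```python
# Relative performance delta, as used in the comparison table:
# a positive value means the first GPU is that much faster than the second.
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster fps_a is than fps_b, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: 24 FPS vs. 20 FPS
print(f"{percent_faster(24.0, 20.0):.2f}%")  # 20.00%

# A negative result means the first GPU is slower:
print(f"{percent_faster(18.0, 20.0):.2f}%")  # -10.00%
```

Note that by this measure a "200%" entry means roughly three times the frame rate, which is why the HD 2300's averages translate to "two or three times faster" in the discussion below.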

Starting with the IGP comparison, we find that AMD does indeed continue to place ahead of Intel in overall performance. Interestingly, however, Intel does manage to run a couple of games faster. Supreme Commander is extremely CPU intensive, which may help explain that particular result, but most of the remaining games should be largely GPU bottlenecked. Battlefield 2 was at one point completely unable to run on the GMA X3100, as were many of the other games. Over the past several months, Intel has continued to improve its drivers, and we're now at the point where nearly all of the games ran without issue. Battlefield 2 at medium quality still exhibited some graphical artifacts, so that result should be disqualified, but performance at medium quality is too slow regardless.

Given the improvements we've seen with updated drivers, we would go so far as to say that the GMA X3100 could probably equal or even slightly beat the X1250 if Intel could optimize its drivers further. That may be surprising to hear, but the Intel GPU has as many pipelines as the Radeon X1250, and current results in Battlefield 2, 3DMark, and a few other titles indicate that there's still untapped potential. Performance under Windows XP tends to be even better, as those drivers are more fully developed. (We will include results from a laptop running Windows XP with the GMA X3100 in a forthcoming article.) As it stands, however, AMD still has about a 20% performance advantage in the IGP sector. That really isn't much, especially considering the relatively low frame rates we're already talking about.

The 20% performance lead looks even less impressive in light of the performance of the Mobility Radeon HD 2300. Frankly, the HD 2300 still isn't particularly fast, and most games need to be run at medium or low detail levels in order to achieve acceptable frame rates at resolutions up to 1280x800. However, while the performance of the HD 2300 might pale in comparison to faster desktop offerings, it generally turns in performance figures two or three times higher than either of the IGPs we're looking at today. It also offers complete SM3.0 support along with DirectX 10 capability, though not surprisingly the DX10 support is more of a checklist feature than anything truly useful right now - of the few DX10-enabled games currently available, most cause pretty severe performance drops even on top-end hardware like the GeForce 8800 and Radeon HD 2900.

Our conclusion, as usual, is that anyone who actually intends to play 3D games on a laptop should at least invest the extra $80 in an entry-level discrete GPU. Even better would be a midrange HD 2600 or GeForce 8600M/8700M, though those tend to be found in laptops that cost closer to $1500 (barring sales and other special offers - as usual, shop around). You can look at the detailed performance charts to see exactly how slowly the IGP solutions run, but there are several titles that are completely unplayable even at minimum detail settings. There are also games like Bioshock and Ghost Recon: Advanced Warfighter that require SM3.0 and are incapable of running properly on either of these two IGPs. (We did try the SM2.0 hack to get Bioshock running, but the results weren't pretty, to say the least.)

AMD vs. Intel Futuremark Performance

  • tomycs - Sunday, December 09, 2007 - link

    Since we're talking about bargains, I guess a comparison between the previous-generation midrange chips (GeForce 7600, ATI X1600) and the entry-level graphics chips (GeForce 8400, AMD/ATI HD 2300) would have been nice.
    I find myself choosing between two HPs with almost equal specs (almost no difference between the AMD X2 TL-60 and the Intel T5500), but one with an ATI X1600 and the other with the 8400GS. I'm almost sure I will take the X1600 because of build quality and screen, but I would have liked some numbers regarding 3D performance.
  • mobileuser2007 - Wednesday, October 10, 2007 - link

    Nice summary Jarred.
    I was a little surprised to not see anything about video quality. I, for one, don't do any gaming on my notebook but I do watch DVD movies while traveling. It seems the only way AT measures the success of "graphics" is how well they play games. Any thoughts on comparing systems on other visual aspects?
  • JarredWalton - Wednesday, October 10, 2007 - link

    I guess the real problem is that I think most laptop LCDs suck, which means that even if the video card does an excellent job of decoding DVDs or whatever, the display quality makes it a moot point. I didn't think the 6515b was any better or worse than the dv6500t (or any other notebook, really) when it comes to DVD playback. Of course, you can always get a different DVD decoder application, which can make a big difference. DVD decoding is now at the point where the CPU can do all the work while incurring only a moderate load, even with higher-complexity decoding algorithms that improve image quality.

    Maybe I didn't pay enough attention, though, so I'll see if I can notice any difference with additional testing.
  • magao - Tuesday, October 09, 2007 - link

    Thank you very much for this article.

    I've been looking for a new laptop for the past several months, and have almost settled on the 6515b, the 6510b (if I can find one in Australia), or (most likely) the 6710b.

    I've been searching for months trying to find comparisons of the laptops with anything near the configuration I'm looking at (T7100/GMA X3100, or Turion X2/X1250). The 6515b is pretty much out of contention though since to get an X2 you have to go above the price of the T7100 in the 6710b (the cheap 6515b comes with an MK-38).

    It's not going to be a desktop replacement, but it needs to be grunty enough for serious work, and needs good enough graphics to play things up to the level of Guild Wars at native resolution (1280x800). I had a work laptop recently with a T5500 and GMA 950, and GW was playable (but not great - 20-30 FPS most of the time), so I have reasonable expectations of the 6710b. Interestingly, my home server (E2140 with G33/GMA 3000 graphics) has worse GW performance than the GMA 950 - my understanding was that GMA 3000 is basically an upgraded GMA 950, but there appear to be significant differences (GW detects the GMA 3000 as DX8 but GMA 950 as DX9, even when both have the 14.31.1 driver).

    I'll be *very* interested in the X3100 results you get under XP (with the 14.31.1 drivers).

    BTW, one of the reasons I've settled on the HP laptops is their look and feel. They are simple-looking, no-nonsense designs, that aren't going to show marks, the keyboards feel very nice, the screens are good and the sound is quite good for a laptop.
  • JarredWalton - Tuesday, October 09, 2007 - link

    I'll spoil the results a bit and say that under XP, GMA X3100 appears to best X1250 across the board. Shockingly (pardon the pun), it even runs Bioshock - okay, so it's at about 20FPS at 800x600 (minimum detail), but at least that proves it's mostly drivers under Vista keeping it from running the latest titles. I should have the final article done next week, showing X3100 XP results. Still, for $80 more you can get HD 2300 which remains about 2-3 times as fast, or 8400 GS which is also around 2-3 times as fast.
  • yyrkoon - Sunday, October 07, 2007 - link

    I honestly think your time would have been better spent covering some other aspect of the industry. Everyone knows that AMD is in a "rut" at the moment, and this article really only tells us what we could have guessed on our own. Reasons for an article of this type, in my own opinion, would be groundbreaking news, or at the very least a much shorter article just covering the important stuff such as AMD's mobile graphics superiority.

    There are lots of people out there, myself included, who would like to see you guys do an article on something like SAS in depth, or SATA port multipliers, with benchmarks, implementation, etc.

    Also, just going from past experience reading your articles, I cannot help but wonder why you guys do not have any how-tos such as 'how to overclock an Intel Core 2 CPU' or 'how to build a cheap storage solution with SAS/HPM technology', etc. I honestly think filling content with things such as the above-mentioned how-tos would be far more beneficial to your readers than the obvious reiteration of things we already know.
  • zsdersw - Sunday, October 07, 2007 - link

    quote:

    or at the very least a much shorter article just covering the important stuff such as AMD's mobile graphics superiority.


    Oh? So that's the only thing that's important? It's dubious that you'd pick one of the few bright spots in the article for AMD and tout it as "the important stuff".
  • yyrkoon - Sunday, October 07, 2007 - link

    How would it be dubious that I do not care to hear about the same thing I have been hearing about for the last several months?
  • zsdersw - Sunday, October 07, 2007 - link

    What you do and do not care about is not what's dubious. What's dubious is that the only thing you supposedly regard as "the important stuff" just happens to be the one area of mobile platforms where AMD generally fares better than Intel (mobile graphics chipsets).
  • yyrkoon - Monday, October 08, 2007 - link

    Look, guy, if you're going to call me an AMD Nazi, fanboi, or whatever, why don't you just come out and say so, instead of making stupid comments that MAY imply *something*? You would be wrong, by the way.
